
Hello Metrics Fans,

 

As the new lead of the Core Data Service, my focus for the next several iterations of the survey will be to improve the metrics and benchmarking capabilities.  Currently, the survey is rich in content on many services.  Ultimately, I’d like to start defining metrics/KPIs for these services (possibly based on the standards the CEITPS develops) so that they can be included in the CDS. 

 

Wouldn’t it be great if you and all of your peers had a place to submit standards-based metrics for specific services (network availability, for example) so that you could benchmark against one another? That’s the plan.

 

I recently identified the major services referenced in the CDS.  They are –

 

Networks

Enterprise Infrastructure (data centers)

Security Administration

Support Center

Messaging/Messaging Infrastructure (email, webmail, etc)

 

If you could pick one metric for any of these services to include in the Core Data survey, what would it be?

 

Thanks for your help!

Leah

 

Leah Lang

Senior IT Metrics and Benchmarking Analyst

Educause

llang@educause.edu

 


Comments

Hi Leah!

 

If I could only pick one, I’d go with Abandoned Call Rate for the Support Center (as a measure of service availability). This data can be collected from an automated call distribution (ACD) system for managing/tracking phone calls, so you wouldn’t have to rely on human intervention. From the customers’ viewpoint, I think it would be a good one to compare across institutions. Of course it should be tempered with total calls (another related measure). I’d like to see it on a monthly trend line.
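A minimal sketch of that calculation, assuming a hypothetical CSV export from the ACD with one row per offered call (the column names here are illustrative, not from any particular system):

    import csv
    from collections import defaultdict

    def monthly_abandoned_rate(acd_export_path):
        """Abandoned-call rate per month from a hypothetical ACD export.
        Expected columns (illustrative): call_start ("YYYY-MM-DD HH:MM"), abandoned ("0" or "1")."""
        totals = defaultdict(int)      # month -> calls offered
        abandoned = defaultdict(int)   # month -> calls abandoned before answer
        with open(acd_export_path, newline="") as f:
            for row in csv.DictReader(f):
                month = row["call_start"][:7]          # e.g. "2012-01"
                totals[month] += 1
                abandoned[month] += int(row["abandoned"])
        return {m: (abandoned[m] / totals[m], totals[m]) for m in sorted(totals)}

    # Monthly trend line: abandoned rate printed alongside total call volume.
    for month, (rate, total) in monthly_abandoned_rate("acd_calls.csv").items():
        print(f"{month}: {rate:.1%} abandoned of {total} calls")

Printing the rate next to total calls keeps it tempered by volume, as suggested above.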


Marty

 

Message from jj014747@pomona.edu

First Call Response time, Leah. As a recipient of services myself, if I must contact support, I would like to have my question/problem resolved within one phone call or at most 2-3 email transactions.


Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp


Message from warren.apel@gmail.com

Hi Charles,
Great idea - I need to collect some benchmarking data for my tech plan, and your survey would make that easier. I'd suggest some questions to help break down tech support (fix-it) staff vs tech integration (educational) staff.  And maybe an open-ended text area for a tech staffing summary, to help clarify those schools with one person who's half-this and half-that - they sometimes show up as a 0.5 here or there but are hard to count.

I'd like to see trends in SIS selection: not just what you are using, but for how long, and what you used before. Same for email systems.

Thanks for doing this!
Warren

On Tuesday, January 3, 2012, Charles Thompson <CharlesThompson@taftschool.org> wrote:
> Happy New Year all!
> We have just under 6 months before most of us convene again for our annual conference. Last year, one of the topics that came up in the closing meeting was the usefulness of a survey that really includes a lot of details about our schools and our departments - to help as we compare what we do to others. Of course, we'd ask the mundane demographic stuff. But the question is, what else would we be interested in asking? This could be a two-part survey, with one part being school-specific and one part anonymous (for info like staff salary ranges, etc.).
> I would like to see if I can put together a survey in the next month or so. Can you let me know what type of questions you would like to be included in such a survey?
> Thanks
> Charles
>
> ------------------------------------------------
> Charles D. Thompson
> Director of Information Technology
> The Taft School
> 110 Woodbury Rd.
> Watertown, CT 06795
> 860-945-7989

Message from shawt@d-e.org

Charles,

We do a comprehensive, school-wide survey of this type with an organization called JRPO, and I have historically been pretty frustrated by the way that they handle the IT portion. If you think it might be helpful to pull together a small working group to develop this survey, I would be very happy to participate. Based on my experience with JRPO, here are some items that might be valuable...

1. Anonymous data - Each participant should see where his/her own school ranks on a particular statistic, but names of other schools should be redacted. We could accomplish this by submitting data online and assigning anonymous keys to participating schools.

2. I would divide the questions into the following categories: Staffing, Budget, Infrastructure, and Impact on learning.

Items related to IT Staffing
2. Metric of IT Department Workload - As Mary correctly points out, IT staff responsibility varies from school to school. We need to capture a general sense of the workload of the staff at each school. This would include areas of responsibility, but it would also include things like the number of network services supported and expectations of response-time. It would also capture the number of hardware devices supported.

3. Metric of IT Department Staffing - This would include both head-count as well as some way of capturing the actual staffing model and areas of responsibility.

4. IT Department Salary Data

5. Metric of IT department efficiency - This would somehow combine items 2-4.


Items related to IT budget
1. Normalize and compare various common budget categories. This is obviously tricky because variations in how things are budgeted for can skew things dramatically. In general, though, I would like to see how we compare in some of the following areas:
- Annual teacher computer purchases / leases (more importantly, how often do you refresh teacher computers?)
- Annual capital purchases - infrastructure, classroom AV, etc. (more specifically, what percentage of your overall infrastructure are you re-investing each year?)
- Annual printing/copying expenses (as a percentage of either number of workstations or number of printers)
- Annual Repair / Maintenance (as a percentage of workstation fleet)
- IT staff Professional Development
- Teacher PD related to technology

Items related to infrastructure
- to what extent are servers virtualized?
- model used to provide computers to students (1:1, virtual desktops, computer labs, etc.)
- wifi coverage and speeds
- network bandwidth
- Student Information System
- Course Management System
- Web Site hosting
- Internet connection speed

Items related to the impact of technology on teaching and learning
- There are obviously lots of metrics here, but it might be hard to come up with some very objective ones that would be a) easy to collect and b) meaningful across different institutions. Perhaps some of the following might be a good start:
- Use of course management tools (Moodle/Blackboard) to deliver course material to students
- Number of technology-based elective courses (computer science, robotics, etc.) and enrollment in these courses


Message from mary@princetonfriendsschool.org

I would like to be a part of this working group, Trevor.

Mary

 

Mary D'Amore

Director of Technology

Princeton Friends School

470 Quaker Road

Princeton, NJ 08540

(609) 683-1194 X32

(609) 731-7255 cell

www.princetonfriendsschool.org

 

Do some "light" reading and help save 125 million trees a year.

 

 


Message from tflanagan@winsor.edu

Charles,
Here are my main concerns.
  1. Extent and use of ebooks for coursework: how did you purchase or rent them? Difficulties finding books? Are you creating your own ebooks? What devices do students use for reading ebooks?
  2. iPad implementation: successes and failures.
  3. Technology integrationists: do you have any? Are they part of the tech department? Do you have tech “advocates” in the department? How are they compensated?
  4. What does “the transformative use of technology in education” actually mean?
Thanks for putting this together.
Tom Flanagan
The Winsor School
 
Message from alex.podchaski@oakknoll.org

I like the direction Trevor is going with this. I would also be willing to jump in to help whatever group is trying to take this on.

 

I also think we need to make sure we bring in perspectives on systems that we might have a hand in but that do not fall directly under our purview, such as mobile phones, keycard/security systems, phone systems, library catalogs, helpdesk software, and the other things we do that have an effect on our infrastructure. We should also include policy and business process concepts in the survey, but that might prove to be difficult to do in much detail. Thanks, Charles, for bringing this up; I am ready and willing to help in whatever way is needed.

 

Alex

 

 

Alex J Podchaski
Director of Technology

Oak Knoll School of the Holy Child 
44 Blackburn Rd, Summit, NJ 07901
Tel 908-522-8159 | http://www.oakknoll.org

Think before you print

 

 

 


Message from richardson@rutgersprep.org

Friends,

I agree with Alex about bringing in other pieces. In our case this includes the fact that digital resources for administrative offices (desktops, Blackbaud, etc.) come out of departmental budgets, not out of a centralized IT budget.

Gathering dispersed data and comparing apples to apples is going to be tough.

Peter
Peter K. Richardson
Director of Information Services
Rutgers Preparatory School
1345 Easton Avenue
Somerset, NJ 08873
732-545-5600
www.rutgersprep.org


Charles,

I suggest keeping demographics on day versus boarding students.  Maybe a count or percentage.  I would expect that to impact IT staffing from a support and system monitoring point of view.

On a somewhat related note, it might be worth trying to get some information on the number of hours when support is available, or how much support is available outside of normal school hours. That might be difficult, as it can be kind of gray at times, but it would be useful information when looking at staffing and budget (such as support contracts).

--
Bill Campbell
Academic Technology Coordinator | Dwight-Englewood School
www.d-e.org
+1 201-569-9500 x3827
campbb@d-e.org
Twitter: BillCamp




I also like Trevor's approach of convening a small group to determine what information should be collected. Otherwise, the sum total of fifty schools' desires will create an unmanageable database. We tried this in the CT Association a number of years ago (with FileMaker) and were never satisfied with what was collected. These were the lessons learned:

1. Collect data and simple information, avoiding the idiosyncrasies of individual schools. We tried to accommodate everything, and ended up with lots of text boxes that were not searchable, and went on forever in some cases. Some topics cannot be captured in a database; they are better discussed at edACCESS conferences.

2. Because schools are different sizes, absolute numbers alone won't have much value. We should include ratios as well. What is your IT budget as a percentage of your total school budget?   What is your staff budget as a percentage of your total IT budget? Those ratios will help to make comparisons more valid.

3. Think about information that will be helpful prospectively rather than retrospectively: questions about hosting services, cloud computing, iPad or other tablet programs, BYOD programs, schools using metrics to assess academic tech initiatives, online and blended learning programs, etc.

Joel

Joel Backon 
Director of Academic Technology/History
Choate Rosemary Hall
333 Christian St.
Wallingford, CT 06492
203-697-2514
Sent from iPad2



Message from mary@princetonfriendsschool.org

Good point, Alex,

For instance, mobile phones, security systems, phone systems, library catalogs, and helpdesk software all fall under IT’s purview at PFS.  A list of all of these things would be a good starting point for this study.

Mary

 

Mary D'Amore

Director of Technology

Princeton Friends School

470 Quaker Road

Princeton, NJ 08540

(609) 683-1194 X32

(609) 731-7255 cell

www.princetonfriendsschool.org

 

Do some "light" reading and help save 125 million trees a year.

 

 


Hi Julianne,

 

Do you mean the time it takes to resolve a problem, or the number of problems that are resolved in one phone call or 2-3 emails?

 

An example of the former (time to resolution) would be the amount of time between when the request was placed and when it was resolved.  This could be counted in minutes, hours, or days.

 

An example of the latter (first call resolution rate) would be the number of incidents resolved in the first call / total number of incident calls.  With the email example it would be the number of incidents resolved within 2-3 email transactions / total number of incidents reported via email.
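As a small illustrative calculation of both measures (the incident records and field names below are made up for the example, not taken from any particular ticketing system):

    from datetime import datetime

    FMT = "%Y-%m-%d %H:%M"

    # Hypothetical incident records: when the request was placed, when it was resolved,
    # and how many contacts (calls or email exchanges) it took to resolve.
    incidents = [
        {"opened": "2012-01-03 09:10", "resolved": "2012-01-03 09:25", "contacts": 1},
        {"opened": "2012-01-03 10:00", "resolved": "2012-01-04 16:30", "contacts": 3},
        {"opened": "2012-01-04 08:45", "resolved": "2012-01-04 09:00", "contacts": 1},
    ]

    # Time to resolution: elapsed time between request and resolution, here in hours.
    hours = [
        (datetime.strptime(i["resolved"], FMT) - datetime.strptime(i["opened"], FMT)).total_seconds() / 3600
        for i in incidents
    ]
    print(f"Average time to resolution: {sum(hours) / len(hours):.1f} hours")

    # First-call resolution rate: incidents resolved on the first contact / total incidents.
    first_contact = sum(1 for i in incidents if i["contacts"] == 1)
    print(f"First-call resolution rate: {first_contact / len(incidents):.0%}")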

 

Thanks for clarifying.  So far there have been great ideas here!

Leah

 

Message from jj014747@pomona.edu

Leah, I mean the latter. It's a little harder to capture than the former, but among the people I talk with and read, it can be construed as part of an indicator of customer satisfaction.

We've just started to try to capture it ourselves. Our tools don't capture it automatically, and our processes aren't constructed in a way that captures it naturally, so we're just taking our first steps to see if we can get it. The next step would be to see how that number can be used to measure customer satisfaction, again as part of a suite of statistics, in conjunction with satisfaction surveys of course.


Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp


Interesting, Julianne!  I’ll be leading an ECAR study this spring to evaluate the current state of customer satisfaction metrics.  One possible outcome of this study would be to create a standard customer satisfaction survey for IT services that all schools could use.  Eventually, results from these surveys could be submitted in aggregate to the CDS so that schools can benchmark customer satisfaction.  We’re a few years off from this type of thing, but we’re optimistic about the direction of this project.

 

Please keep me posted on the development of your surveys.

 

-Leah

 

Message from jj014747@pomona.edu

And time between ticket creation and closure is so variable based on the complexity of the reported issues or service requests that, while I track it, I am not yet convinced of its value.   

Other things we track:

  • Incidents opened in a month that are still active at the end of a month by Priority and their average age (see above)
  • Incidents opened in a month that are closed by priority (and their average age)
  • Service Requests opened in a month that are still active etc.
  • Service Requests opened in a month that are closed etc.
This set of reports helps me see if we are attending to high-priority requests and incident reports in a reasonable amount of time compared to those of a lesser priority.

We do the same thing for urgency (whereas urgency represents the impact and priority represents the time sensitivity).

  • Number of incidents by service component in a month (service quality management)
  • Traditional aging report of the number of tickets received and the number solved within certain time frames, such as less than a day, a week, two weeks, etc. (a rough sketch of this calculation follows below)
  • Incident resolution by team: Average time it takes to resolve, total # issues (open, any status)
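
A rough sketch of that aging-report calculation (the ticket export format here is hypothetical; only the bucketing logic is the point):

    from datetime import datetime, timedelta
    from collections import Counter

    FMT = "%Y-%m-%d %H:%M"

    # Hypothetical ticket export: opened/closed timestamps ("" means still open).
    tickets = [
        {"opened": "2012-01-02 09:00", "closed": "2012-01-02 15:30"},
        {"opened": "2012-01-03 10:00", "closed": "2012-01-09 11:00"},
        {"opened": "2012-01-04 08:00", "closed": ""},
    ]

    def age_bucket(opened, closed):
        """Bucket a ticket by time to close; unresolved tickets land in 'still open'."""
        if not closed:
            return "still open"
        age = datetime.strptime(closed, FMT) - datetime.strptime(opened, FMT)
        if age < timedelta(days=1):
            return "closed in < 1 day"
        if age < timedelta(weeks=1):
            return "closed in < 1 week"
        if age < timedelta(weeks=2):
            return "closed in < 2 weeks"
        return "closed in >= 2 weeks"

    for bucket, count in Counter(age_bucket(t["opened"], t["closed"]) for t in tickets).items():
        print(f"{bucket}: {count}")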

ACD reports are probably the standard reports: abandon rate by month, day, date, and time. I use these to tweak phone coverage if needed and also to corroborate statistics from our ticket software (Numara).




Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp


Message from jj014747@pomona.edu

This is significantly off topic, but I know that some institutions do use the insta-survey features of their ticketing systems. I am not convinced of their usefulness either, as I think there would be a point of diminishing returns. There is an audience that habitually uses the Help or Service Desk. Historically, they are going to be positive in their experience, negative in their experience, or neutral. Neutral users will not respond. Positive users will always be positive, and negative users will be negative once and then refrain from using the service desk. The positive audience may also be the frequent return customers. If they get a survey every time a ticket closes (and an angel gets its wings), either they'll come to ignore it or they'll respond the same way.

My point is I don't think a frequent and repetitive survey is a useful tool but I would love to hear other opinions perhaps in a new thread.

Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp


Message from jj014747@pomona.edu

And another question slightly off topic:  What do other institutions use for their significant abandon rate?

Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp


The original question asked for just one metric, but for a support center, I think you have to have at least two measures -- call answer rate and first call resolution rate -- to know how well you are doing, and both of these have to be good for you to be successful.

For example, if you have a call answer rate of an outstanding 95%, but your first call resolution rate is only 30%, people are not going to be happy because they almost never get their question answered right away.  Similarly, if your first call resolution rate is 90%, but your call answer rate is only 50%, half of your callers are going to be unhappy right away because they did not get through to someone when they called.

So, I feel like these 2 measures have to be looked at together for either to be meaningful.
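
As a purely illustrative back-of-the-envelope view of why the two have to be looked at together (the figures are the hypothetical scenarios above, and treating the rates as independent is a simplification):

    def first_try_success_rate(call_answer_rate, first_call_resolution_rate):
        """Rough share of callers who both get through and get resolved on the first call.
        Assumes the two rates are independent, which is a simplification."""
        return call_answer_rate * first_call_resolution_rate

    # 95% answer rate but only 30% first-call resolution: most callers still leave unresolved.
    print(f"{first_try_success_rate(0.95, 0.30):.1%}")

    # 90% first-call resolution but only 50% answer rate: half never reach anyone at all.
    print(f"{first_try_success_rate(0.50, 0.90):.1%}")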


Jim Bostick
Director, User Services
Technology Services
Virginia Commonwealth University  
jsbostick@vcu.edu 804-827-0390
 
"Most people are about as happy as they make their minds up to be."
   -- Abraham Lincoln



Message from jj014747@pomona.edu

Jim, I couldn't agree more. 

For a Support Center, there are many measures to take, and not one of them by itself answers the question "Are we successfully delivering our services?"

Number of tickets and the time within which they were closed (from all sources including email AND phone AND walk-ins if applicable)
Number of calls
Number of calls answered
Number of calls abandoned/significantly abandoned

This is to get a sense of the traffic and turnaround.  But if it takes me 2 minutes to deliver a bad service…well.



Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp


I absolutely agree. Ideally we should have 3+ measures for every service. I believe we’re dealing with a cultural change here, though, in terms of people submitting this type of data to the CDS. So, while we would eventually like there to be 3 measures per service (for example), I think we’ll have to start small to get people into the practice of submitting this information.

 

Message from dnickles@apu.edu

To avoid the problem of customers ignoring it, we've set up our system to send a survey only on every 10th closed ticket. It's more random that way as well.
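
A minimal sketch of that kind of rule, hooked to a ticket-closure event (the counter and the send_survey callback are hypothetical stand-ins, not any particular ticketing system's API):

    import itertools

    SURVEY_EVERY_N = 10                    # one survey per ten closed tickets
    _closed_tickets = itertools.count(1)   # incremented on every ticket closure

    def on_ticket_closed(ticket_id, send_survey):
        """Send a satisfaction survey only for every Nth closed ticket.
        (A 1-in-N random draw would give the same long-run rate with less predictability.)"""
        if next(_closed_tickets) % SURVEY_EVERY_N == 0:
            send_survey(ticket_id)

    # Example with a stand-in sender: surveys go out for the 10th and 20th closures only.
    for tid in range(1, 21):
        on_ticket_closed(tid, lambda t: print(f"Survey sent for ticket {t}"))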

Don Nickles
IMT Director of Customer Services
Azusa Pacific University
626-815-3841




Message from jj014747@pomona.edu

Don,

I'd be interested in seeing your survey if you would be willing to share it.  At one place I worked we used someone from the Sociology department to assist in developing a survey — though it was a once-only survey and not a repeater.

Julianne Journitz
Director of Client Services
Information Technology Services
Pomona College
24x7 assistance: http://helpdesk.pomona.edu
ITS Website: http://its.pomona.edu
ITS Twitter Updates: http://www.twitter.com/pomonahelp






Great conversation!

 

Perhaps part of the issue you’re (collectively) running into is the difference between data, measures, and metrics. Most of the items in the list Julianne provided last (number of calls, number of calls answered, number of calls abandoned/significantly abandoned) are actually measures, which should be used to develop a metric. So Leah’s need for one metric is very feasible, given that those measures would be used to create the “Abandoned Rate” metric.

 

The other issue I see raised (thanks again, Julie!) is that you may have a lot of data/measures/information that you are “collecting,” and maybe even reporting, but you (again, the collective) aren’t sure how to use the information. This is usually a result of looking at data and measures produced (usually by an automated system) because, well, we “can.” Instead of thinking, “Wow, I have all this data…how should I use it?”, I suggest we start with a root question: what do you NEED (or at least WANT) to know? If it’s a “want” and not a need, you will have to determine why you want it. If it’s a “need,” hopefully the “why” is evident.

 

After you determine your root question, you’ll identify the information needed to answer it. If you already have it, awesome! But if you are missing some, you’ll be able to create a plan to gather it.

 

On the health of a Help Desk (a larger question), once you’ve defined how YOU determine health, the information you’ll need will be evident. One option is to determine that health purely from the customer’s viewpoint (the effectiveness of the service). In that case we normally measure:

 

Availability

Reliability

Speed (to deliver)

Usage

Customer Satisfaction

 

Within each of these categories, I recommend you have at least three “measures” to build each metric.
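
As a purely illustrative sketch of rolling three or more measures up into one metric score (the measures, targets, and weights below are invented for the example, not a recommendation):

    def metric_score(measures):
        """Combine weighted measures into a single 0-100 metric score.
        Each measure has an observed value, a target, and a weight; weights should sum to 1."""
        score = 0.0
        for m in measures:
            attainment = min(m["value"] / m["target"], 1.0)   # cap at 100% of target
            score += m["weight"] * attainment
        return 100 * score

    # Hypothetical "Availability" metric for a help desk, built from three measures.
    availability = [
        {"name": "hours staffed vs. published hours", "value": 0.98, "target": 1.00, "weight": 0.4},
        {"name": "calls answered before abandonment", "value": 0.92, "target": 0.95, "weight": 0.4},
        {"name": "walk-up desk uptime",               "value": 0.99, "target": 1.00, "weight": 0.2},
    ]
    print(f"Availability metric: {metric_score(availability):.0f}/100")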

 

Marty

 

Thanks, everyone, for these ideas/comments. They will be extremely helpful as I work to iterate the content for CDS 2012 this week and next. How do you feel about the ratio of 802.11n access points to total wireless access points as a measure of wireless network quality?

 

By the way… congratulations to one of our own! Our colleague Marty Klubeck just had a book on metrics (Metrics: How to Improve Key Business Results) published by Apress Media. Marty has put a lot of himself into helping others develop measures for improvement, as is evident from his leadership on this CG and his willingness to help anyone one-on-one when requested. His book is just another way he’s trying to help others, and I know it will be great (I’ve got a copy). Check it out!

 

Keep the ideas coming :)

 

-Leah