Collaboration for a Positive-Sum Outcome: An Interview with Ira H. Fuchs
Christopher J. Mackie, Ph.D., formerly associate program officer in the Program in Research in Information Technology at the Andrew W. Mellon Foundation and currently Executive Director of CollaborationSource, a new nonprofit working to bring community-sourcing IT projects to new fields of social impact, recently talked with Fuchs about his past and current work, his thoughts on openness, and his advice for the effective use of campus IT resources.
Mackie: Ira, this is the second time you have received the EDUCAUSE Leadership Award. In 2000, EDUCAUSE honored you for "bringing extraordinary creativity, intelligence, and technological expertise to the challenges of providing electronic access to people and information in support of teaching, learning, and research." With today's ubiquitous Internet, is providing electronic access still a primary concern for higher education? What do you see as the most significant benefits—and also, perhaps, new challenges—arising from this changed environment?
Fuchs: If we define "electronic access" as access within our campuses, by full-time enrolled students, faculty, and staff, then I think most of the work is done. However, if we define it in the precise language of that first award, and especially if we add the qualifier of "effective" access, then I think there is still a great deal of work to be done. Let's confine ourselves, for the sake of brevity, to the U.S. context and simply not discuss the large fraction of the rest of the world's population that remains effectively without such access. Let's also leave aside research, which introduces a different set of challenges, in favor of concentrating on teaching and learning. Within the U.S. borders, there are still significant populations of learners and potential teachers whose access isn't what it needs to be. For example, about one-third of Americans don't have broadband at home. Unserved individuals are more likely to be poor and to have less-than-ideal employment opportunities—in short, they are likely to be the people who most need to be better served by our educational system. In that important sense, our work is very far from done.
It seems to me that the new challenge is in fact a very old challenge. Many EDUCAUSE members are land-grant institutions with missions to serve the citizens of their states; others are community colleges with similar missions to serve their regions. More than a few private-campus members have long, strong traditions of public service as well. Many of these campuses have run large-scale outreach programs in various fields for more than a century. Yet it sometimes seems as if we in higher education information technology view ourselves as an archipelago. As long as our campuses are wired internally, and wired to each other, then we're doing our jobs, no matter how wide and dark the spaces in between the ivy-covered walls. Those "byte-dark" seas are defined away as somebody else's problem.
I'm not sure on-campus access was ever a valid performance standard for us to use, and if it was, certainly it is becoming less so every day. If we are going to fulfill the nation's educational needs—and our national objectives—then we need students, both traditional and nontraditional, to be able to access information in support of learning anywhere anytime, in at least some variety of different fashions to accommodate their different situations. The new types of learning materials being developed require more bandwidth than text-based materials, further increasing the challenges. And of course, we can't really ignore the rest of the world. Some nations and whole regions will continue to lag behind the United States for years, but a growing fraction of the world is leading the United States each year in matters of access, as well as in various measures of educational performance in which access plays a role.
I don't mean to suggest that EDUCAUSE institutions, by themselves, can tackle these challenges. EDUCAUSE is an organization of member-institutions, and the members' abilities to provide direct, access-related services beyond their institutional walls are inevitably limited. However, we can collaborate with other institutions to ensure that no gaps in our access services are left unfilled; we can be sensitive to access restrictions that may be hindering portions of our current or prospective campus populations and can work to overcome those hindrances; and we can support government and private efforts to expand effective access. Perhaps most important, we can use the considerable educational resources of our institutions to help the public understand what is at stake if one-third of the U.S. population remains at best intermittently connected to the growing knowledge economy.
Mackie: In 2000, you stated: "Those of us in higher education can openly leverage and combine our resources. In short, we can freely collaborate." In the following ten years, through your work at the Mellon Foundation, you put those words into action. Can you talk a bit about some of the collaborative projects that you shepherded at Mellon and about why you chose to support those particular applications?
Fuchs: The higher education projects we funded at Mellon fall into two large categories. On the one hand, we had projects that involved personal—that is, student and scholarly—tools: projects like Sophie (http://www.sophieproject.org), VUE (http://vue.tufts.edu), and Zotero (http://www.zotero.org). We intended to support projects filling some of the largest gaps in technology-assisted teaching, learning, and research, especially in the arts and humanities but ideally having a wide potential range of applications, and I think the diversity of uptake, as well as the breadth and depth of adoption of the various projects, suggests that we largely succeeded.
Virtually all of the rest of the projects that we funded qualify in some fashion as IT infrastructure. Sakai (http://www.sakaiproject.org) and Kuali (http://www.kuali.org), including the Kuali Financial System, Kuali Coeus, Kuali Student, and Kuali Ready projects, are the best known, but others include uPortal (http://www.ja-sig.org/uportal), Opencast (http://www.opencastproject.org), and SEASR (http://seasr.org). We can also claim some credit for Project Bamboo (http://www.projectbamboo.org), which is just now getting started but which you and I helped to initiate, and colleagues at Mellon supported other projects, such as FEDORA (http://www.duraspace.org), that fit nicely into this space. There's no obvious domain or functional pattern here—some are administrative applications, some focus on teaching, and some are concerned with research. What unites them all is that they are enterprise applications, intended to be integrated with the rest of the campus infrastructure. And of course, all are built on a community-sourcing model, so that they will be self-sustaining going forward.
The rationale for funding the projects varied a bit by category, but every project had to be justified in terms of Mellon's overarching mission to support the advancement of the leading edges of scholarship in the arts and humanities. In some cases, particularly the administrative systems, the rationale was indirect. We funded projects such as uPortal and Kuali because these projects had the potential, which has now been amply demonstrated, to save higher education millions of dollars per year in wasteful redundancy, software license fees, and inefficient system designs. We couldn't force campuses to accept those savings by joining the projects, nor could we force them to allocate the savings back to the arts and humanities, and that was always something of a challenge in defending those projects in terms of Mellon's mission. However, anyone familiar with campus budgetary politics will recognize that even though we couldn't say who would be the winners if the savings were realized, we could say with some confidence that the Mellon constituencies in the arts and humanities would be among the biggest losers if the savings weren't realized. Most of the other projects weren't as challenging to justify. In many cases, such as Project Bamboo and SEASR, the projects are clearly aligned with Mellon's overall mission: they directly serve artists and humanists working to advance the frontiers of scholarship in their fields. Projects like Sakai and Opencast occupy the middle ground: they engage teaching, learning, and research broadly but also support digital materials and formats that are of particular interest in the arts and humanities.
The true unifying factors for all the funded projects were infrastructure and sustainability. We focused on infrastructure because, to be honest, no one else did. In many countries, a national government funder, such as the JISC in the United Kingdom, has primary responsibility for funding IT infrastructure, but the United States has no such entity. During my time at Mellon, we were the only private philanthropic program (as far as I am aware) that directed substantial resources to enterprise-level software-development projects. Some private funders were willing to support end-user applications. We instead tried to build the foundational platforms that campuses, and other funders, could leverage to pursue their missions more effectively. When we succeeded, every dollar spent on technology by anyone, on top of or around our projects, went further. Individually, the benefits weren't always obvious to a single user or funder, but our program budget was a small part of total technology funding, so collectively our force-multiplier effects were usually significant even for a single campus and were reliably huge for higher education as a whole.
We also focused on sustainability. Given the size of our program, there was no way we could afford to continue to support projects indefinitely. Even a second grant was often problematic, so projects needed to be prepared to pay their own way as soon as practicable. In the process of enforcing this, we learned a valuable lesson—namely, simple survival is the single strongest predictor of a project's social impact. If Sakai had helped the original handful of institutions for a few years and had then been replaced, it would have counted as a significant success by philanthropic standards, but its overall social impact would have been minuscule. By becoming self-sustaining, Sakai has now reached the point where it serves close to one-fourth of all U.S. college students (measured by FTE enrollment). Its cumulative economic and social impact has been immense and continues to grow. As a funder, if you have a choice between funding two programs, one with a large annual payoff but no sustaining plan and one with a sustaining plan but a smaller annual payoff, the second option will almost always achieve better results in the medium term and longer.
In fact, for me one of the most gratifying results of our efforts has been the development of community-sourcing spinoff projects not funded by Mellon, such as Kuali Ready and the forthcoming re-engineering of Sakai. Our goal was to help create a world in which these types of projects would not need external support in order to thrive, and the higher education IT community appears to be achieving that goal for us. Another gratifying result has been the robustness of the projects in the economic recession. Most of the projects have slowed their growth as institutions have hunkered down and made fewer system investments, but a few are growing even faster as institutions flock to them because of their superior financials. And all of them appear to be weathering the storm at least as well as their commercial equivalents.
The fact that many of the community-source projects supported by the Mellon Research in Information Technology program are self-sustaining and continue to provide significant benefits says something important about the power of higher education institutions to organize themselves collaboratively in order to take more control over their destinies. That's a particularly timely lesson for higher education today, now that financial pressures seem to be conspiring to erode our autonomy. There's no better time to remember Benjamin Franklin's observation: "We must all hang together; else, assuredly, we shall all hang separately."
Mackie: Speaking of hanging together—and backing up in time a bit—can you talk about your original vision for BITNET, especially in the context of the Internet as we know it today?
Fuchs: The original goal of BITNET was to connect all of the students and faculty members in colleges and universities throughout the world. As with many networks, it was built on a "field of dreams" theory: we felt certain that if we built it, they would come. There was no Internet yet, although the underlying protocols TCP and IP did exist and were being used in ARPANET. However, ARPANET was open to only a very limited number of users who were engaged in U.S. Department of Defense contracts. I thought that the ability to send e-mail and instant messages (yes, BITNET had IM) from campus to campus could have a dramatic impact on collaboration among geographically distant (and even neighboring) institutions. BITNET succeeded by connecting millions of users from more than 1,400 institutions of higher education, government laboratories, and IBM's VNET network. It was the first academic computer network to connect the United States to Japan, Taiwan, Singapore, Israel, the USSR, and most of Western Europe. It helped to set the stage for the success of NSFnet, which led directly to the Internet as we know it today.
Mackie: These days you're heading Next Generation Learning Challenges (NGLC). How would you describe the program to those who are not familiar with it?
Fuchs: NGLC is a program administered by EDUCAUSE and co-sponsored by a group of philanthropic and policy organizations: the Bill & Melinda Gates Foundation, the William and Flora Hewlett Foundation, the Council of Chief State School Officers, the International Association for K-12 Online Learning, and the League for Innovation in the Community College. NGLC works to dramatically improve students' college readiness and college completion in the United States by expanding the use of promising technology-driven solutions to more students, teachers, and schools. NGLC begins by recognizing that potential breakthrough solutions are already being developed and implemented by educators, institutions, technologists, and entrepreneurs but that too often these innovators have little access to each other or to strategies for disseminating their approaches. They need opportunities to collaborate with like-minded visionaries and complementary initiatives, to broaden the reach and impact of their solutions.
Therefore, instead of providing "pure R&D" funding, we work to accelerate the adoption of technology-enabled learning solutions that have already shown promise. We try to provide three primary sets of resources to such solutions: investment capital to help bring promising solutions to many more students; an active community of innovators and educators who are committed to driving next-generation learning forward; and support for building a body of evidence that others can learn from and use to identify and adopt the best technologies to increase college readiness and completion.
We focus on technology that can support student achievement from kindergarten through the senior year in college, with an emphasis on helping low-income students, who often face the most significant barriers to college readiness and completion. I should also note that we are far from alone in our efforts. Our initiative complements many other leading college readiness and completion efforts in policy, measurement and analysis, and financial aid.
Mackie: Indeed, the spotlight on college completion seems to be growing stronger every day with additional reports of activities in this field, including a new national target by U.S. President Barack Obama and large initiatives from the U.S. Departments of Education and Labor. What makes NGLC distinctive?
Fuchs: I think one of our key points of differentiation is the technology sophistication that EDUCAUSE brings to NGLC. NGLC focuses clearly and distinctly on the various ways that technology can enable the kinds of innovation we're seeking. The grantees that we have selected have all demonstrated considerable technical, as well as pedagogical, sophistication in their approaches to the problems they are tackling. That's particularly noticeable in some of the ways that the projects address the "not invented here" problem.
Another point of differentiation is sustainability. As I mentioned earlier, at Mellon I learned that only a project that survives is likely to have any considerable social impact. At NGLC, we stopped short of requiring a fully worked-out sustainability model as a prerequisite for funding because there are some great ideas that we wanted to have the choice to fund but that won't likely be sustainable until after they've been proved at the next level of adoption. However, even with that caveat, our program places a strong weight on sustainability from the inception of the project. I hope our focus on sustainability will make a real difference when we look back, a few years from now, and see what the various programs and projects have accomplished.
A third differentiator is our agility. From start to first award, we moved faster than any other program of which I'm aware. We owe a lot of thanks to our sponsors, including the program staffs at the Gates and Hewlett Foundations, for encouraging us to accelerate the process so aggressively in order to get money out and get real project implementations on the ground as rapidly as possible. We're also able to respond in a more agile fashion to what we learn than are some of the government funding efforts. We're less encumbered by government procurement policies, meaning that doing things like changing our Wave II panel in response to what we're learning from Wave I is easier for us than it would be for some of our government colleagues.
At the same time, I wouldn't claim that we're unique. We have some advantages, but so do the efforts that we are trying to complement. For instance, I wouldn't mind having a $500 million grant-making budget! But ultimately, I think it's far more important that we find ways to work together effectively—funder to funder and grantee to grantee. We're not in competition with our peers; we're all trying to address closely connected sets of challenges. The more we complement and support one another, the further we will all get, together.
Mackie: In the first wave of NGLC funding, you sent out a call for projects in open educational resources (OER). Openness, of content and technology, seems to be an increasingly common requirement for both government and philanthropic grants. Is higher education ready to share so openly? What's in it for the campuses and systems? What does "not invented here" mean in this context, and how, exactly, does NGLC plan to move the community past that?
Fuchs: The problems facing higher education, and all of education, are enormous. Cost increases are outpacing the ability of most Americans to afford a college education, whereas there has been little improvement in campus productivity. We've all heard, and lived, the litany of problems and complaints. Even campuses that are not accustomed to feeling financial pressure are becoming desperate to find ways to both reduce costs and improve outcomes. NGLC is one program seeking to help them find those ways.
Our premise is that the demand exists to adopt technology-based solutions that have been shown to reduce costs, improve outcomes, and provide an educational experience more in keeping with today's students—education that is anywhere, anytime, interactive, and at their own pace. To make this work, we need to break the entrenched "not invented here" thinking and accept that many courses, especially intro/gatekeeper courses, can be taught effectively using blended learning—media-rich curricula that utilize data analytics to give real-time feedback to students, faculty, and advisers and that thereby provide an optimally tailored learning experience to each student. Those curricula need not be written by the specific faculty member who delivers them, just as he/she does not need to have written every textbook or article in the current curriculum. It is sufficient that the instructor have the ability to contextualize the content by selecting the particular subset for the course, tailoring the content to local and/or personal needs, norms, and practices, and delivering it in the way with which the instructor is most comfortable.
We don't underestimate the challenges facing campus innovators who want to make this happen. The concept of faculty autonomy is an essential and deeply cherished fact of life at many campuses. What we are mounting is not an assault on faculty autonomy; it is an assault on pedagogical inefficiency. Technology can be used to increase faculty control over content, especially for introductory and gateway courses, by replacing the current "every course is an island" approach with a collaborative model in which the faculty member has a voice in the governance of the curriculum content, as well as personal control over its contextualization, tailoring, and teaching, while also getting all of the advantages of a best-of-breed curricular-development process that spans many institutions and contexts.
Mackie: In regard to challenges, will OER pose any significant technical challenges for institutions that want to make use of it?
Fuchs: At the level of a specific course and instructor, we have worked very hard, and will continue to do so, to ensure that any technical challenges will be minimal. Much of the new OER will be deployable via existing LMSs (Learning Management Systems), either without system modification or via LMS modifications that are being developed as part of the grants. These modifications may need to be implemented by a campus wanting to use some of the content, but grantees have been given strong incentives to make the work as easy as possible. We have also made standards compliance a prerequisite for funding, so standards like SCORM will be commonplace. In addition, we have insisted on license standardization, using Creative Commons licensing, so that campuses won't encounter significant license-compliance burdens.
Campuses planning to take advantage of the new OER should be aware that some of the most interesting and potentially exciting next-generation content that has been proposed is unlikely to run within their LMSs, at least at first. Much of it is interactive, involving technologies such as cognitive tutors that require their own tech infrastructure. The grantees involved with those efforts are exploring a variety of delivery options, including Web 2.0/SaaS (Software as a Service) hosting models, to ensure that no campus is excluded from using the new OER because of systems incompatibility.
At the NGLC program and national levels, there will be a few technical challenges involved, primarily with respect to curating the content and making it easily discoverable, available, and customizable on a sustainable basis nationwide. We don't have a national enterprise software infrastructure for education in the United States, so there is at present no automatic home for these materials anywhere above the level of the individual campus. That suggests a federated system in which the OER is hosted in a variety of locales, including on many campuses, and is unified by federated discovery and identity management systems. Within such an architecture, the ideal campus-level solution would have both the delivery features of a high-quality LMS and the enhanced metadata and workflow management capabilities of an institutional repository, coupled with services-oriented middleware to support campus-by-campus enhancement of the interactive OER going forward. Campuses whose LMSs already integrate with their institutional repositories, and whose academic technology units are already engaging with enterprise SOA, will probably have a slight edge in managing and sustaining whatever solutions evolve to deal with these challenges. In fact, campuses will undoubtedly have a leadership role to play in determining what the solutions will be.
But again, our goal at NGLC is to ensure the availability of these materials to every U.S. educational institution that wants to take advantage of them and especially to the institutions at which the majority of the nation's at-risk students are enrolled. We know we won't succeed unless we minimize the technology burden on U.S. higher education institutions. So I suspect that any developments on the above-campus infrastructure will also include SaaS services to enable campuses that don't have the required infrastructure at present, and that cannot afford to buy or build it, to still engage affordably and effectively.
Mackie: Now that you've been a funder for both Mellon's Research in Information Technology program and NGLC, what do you see as the big differences?
Fuchs: At Mellon, we had fewer masters to satisfy. The NGLC sponsors have been terrific, but any collaborative effort such as this one involves a certain, inevitable amount of process. We also pursue a panel-based approach at NGLC, whereas Mellon used a "venture philanthropy" approach in which we first identified prospects through internal resources and then worked with them to develop a proposal. The panel-based approach limits us to evaluating the proposals as they are submitted, which can be a little frustrating when we receive a proposal containing a terrific idea that needs only a relatively minor adjustment to move from noncompetitive to highly competitive.
That may sound like Mellon had the edge, but there are some offsetting benefits to NGLC. For example, we're funding 29 proposals in the first wave of NGLC, which is about the number of project proposals, not counting the Mellon Awards for Technology Collaboration, that Mellon's program funded in its entire ten-year history. Because NGLC is funding so many projects concurrently, we can achieve levels and degrees of inter-project coordination and cooperation that were effectively impossible for us at Mellon. It's too early to tell how well we will utilize those opportunities, but I'm hopeful they will help our grantees achieve some remarkable accomplishments.
I'm glad I had the Mellon experience before joining NGLC. I think NGLC created a better RFP because of what I learned at Mellon, and I think my colleagues at NGLC and I have designed an excellent program, blending lessons learned by both panel-based and venture-oriented funders.
Mackie: As a funder, a policy agenda-setter, and a longtime CIO, what advice do you have for today's campus CIOs? Where should IT leaders be focusing their thinking and their strategic planning? And if a provost or president is reading this interview, what advice would you give him or her about making better use of IT resources?
Fuchs: Cost-cutting is never fun—and especially not at the scale most institutions are facing today. There's no shortage of good advice out there: study what others are doing; protect the best practices; don't mow all the grass to the same height; and so on. It's all good advice, and it's all hard to follow, especially because following most of it requires excellent leadership all the way up the organizational chain—in the case of public institutions, all the way up to the legislature. Every level of the institution has to be willing to let every unit protect excellence while cutting underperformance. If even one decision-maker tries to exploit another's honest budget-cutting for its own benefit, then honest budget-cutting can easily disappear in favor of bureaucratic turf-protection.
Instead of repeating more of the same wisdom, let me focus on something that's relevant both to CIO decision-making and to my present job: the issue of collaboration. In a world where so many budget choices reduce to zero-sum or even negative-sum, multi-institutional collaboration is one of the few options that allow the possibility of a significantly positive-sum outcome. When you put a dollar into a well-designed collaborative project, you get more than a dollar of value back, sometimes much more. You don't always get the value back into your budget—that depends on the project and the nature of your investment. But you get the value back into your campus, in the form of improved outcomes and/or lowered costs for someone.
Still, collaborating on OER is different from collaborating on enterprise information technology. For one thing, you can't do it just within the IT unit. If you don't involve faculty and academic administrators as co-owners in the OER process, you will surely fail. For another, you can't assume widespread faculty enthusiasm for your collaborative ideas: sheer inertia will defeat you at most campuses, especially now that so many are focused on responding to budget crises. Instead, you need to find heroes: faculty and academic administrators who are interested in pioneering (on your campus) new approaches to teaching and learning. These need to be influential faculty. You need to make the case to your chief academic officer and to your chief financial officer. And you need to work diligently to document your results, in order to prove to the rest of your faculty that the new approach delivers the benefits you claim.
Any potential innovator has some significant resources in this effort. For one, many of your campus's peer institutions will be participating in similar efforts. To stay out may result in being left behind. For another, because they are OER, these programs collectively often represent a grant of several million dollars in curricular support to each and every U.S. higher education campus—all the campus has to do is reach out to take and use the OER product that results. There's no grant-getting process (unless you want to be part of the OER-producing projects, in which case look for future NGLC and other sources), and the only cost of entry is the cost of switching to, and sustaining, the new materials. For some institutions, those ongoing costs will be less than what they pay now; that will probably be true for every institution if the costs charged directly to students, such as textbooks, are included.
In a perfect world, all IT leaders and institutions would take a hard, careful look at what NGLC and the other programs are trying to accomplish and would decide for themselves whether we're providing you with better options than you have today. I hope we are, but if we aren't, it's important that we know sooner rather than later so that we can course-correct and help you accomplish what you need to accomplish in the way of student performance, under the real-world constraints that we all face today.
Mackie: How can funders and educational institutions together make better, more effective use of information technology?
Fuchs: The flood of new, next-generation, commercial-grade OER resulting from NGLC and the other OER programs is going to dramatically increase faculty members' choices with respect to content for their gateway and service courses, reduce the personal effort required to provide first-rate content to their students, and accelerate student performance. The fact that the new curricular content is open-licensed is going to reduce the costs to students as well. But those increases in faculty autonomy, student performance, and student cost-effectiveness will persist only if campuses can band together to sustain what has been provided for them. Some of this high-performing OER can cost upward of a half-million dollars per course to create and involves the work of dozens of specialists. It is correspondingly expensive to maintain and enhance. Relatively few U.S. campuses can afford to pay that much for each course in the general education curriculum. And even those that can afford the cost must ask themselves whether they can assemble the necessary talent and, if so, why they would want to. Wouldn't it be much smarter to band together—not necessarily all institutions, but at least sets of a dozen or more like-minded, like-missioned campuses—to share the costs and pool the intellectual resources required to turn this one-time output of pedagogical content into a continuously improving, high-quality, sustained flow?
I stress this because I think EDUCAUSE members and their campus IT organizations and our NGLC partners can play an important role in helping academics, who may never before have seen collaboration like this, to understand its benefits. We in higher education information technology have a long, proud tradition of multi-institutional collaborations, from BITNET onward. Some of our faculty colleagues do too, especially those in the "big" sciences, but many have no firsthand experience with large-scale, multi-institutional collaborations. Academic technologists can play an important role in showing faculty colleagues, deans, and even provosts that these kinds of models can and do work, that they're not impossibly hard to create, govern, or sustain, and that the benefits they confer are real. We need to speak and act with sensitivity to our academic colleagues' sensibilities, but we should recognize, and help campus leaders to recognize, that we can be full participants and even leaders in this process, contributing valuable experience in forging durable collaboration.
Collaboration is key. I mean real collaboration: broad, deep, intensive, durable work with mutual, joint accountability for success. Individual experiments have their place, but once an approach seems promising, funders must help institutions work together as a community to implement, enhance, and sustain that effort.
I hope many of us will take up these challenges.