Establishing a Data Management Program
As institutions review their strategic objectives after the events of the last several months, a few critical changes come into view. If there were ever a time to ensure that the technology and people supporting institutions are flexible and agile in solving problems, this is it. A wave of uncertainty has come over higher education, at a time already rife with existential questions from both inside and outside institutions.
If we layer on top of the COVID-19 situation all of the organizational demands placed on institutional data—enrollment projections, multitudes of budget scenarios, and a continued focus on student success—actively managing that data becomes imperative. On many campuses, data is sprawled across the institution and is often effectively unmanaged. Recent efforts to better secure data have brought increased scrutiny of where data is stored, how it is used, who controls it, and how it is secured.
Another critical activity for most institutions is cost control. Is your data stored in multiple places, in multiple formats, on multiple servers? How might an integrated, holistic approach to the physical management of the data reduce costs? What platforms might you retire?
The Tambellini Group is engaged in a broad research agenda around the topic of data management, including the specific disciplines of reporting and analytics, data warehousing, integration, and data governance. Because institutional data is an asset to be secured, organized, and fully leveraged, all of these interrelated topics should be addressed in a comprehensive data management program.
What Are the Benefits of a Data Management Program?
A full data management program, integrated from executives to analysts, can provide broad benefits to an institution.
- Confidence. If data consumers understand where their data comes from, who controls it, and its precise definitions, they will spend less time re-validating it and reshaping it to fit their own view of the world.
- Quality. The data accessed through supported channels has a measured and transparent level of quality that the data consumer can use to judge the outputs derived from the data.
- Consistency. Effective data management reduces redundant data and provides a single definitive source.
- Alignment. The data management program should align executive data needs through the entire stack of analytics tools, data-gathering methods, and data governance activities.
- Agility. With transparency and quality comes agility. This agility includes the onboarding of new datasets and data elements into the data management program, as well as the ability to identify missing data quickly.
- Security. It is nearly impossible to secure data if you don’t know where it is and how it is being stored. As a data steward, knowing where the data you are responsible for resides and where it flows is a critical step in creating a complete information security program.
- Operational efficiency. Reducing data movement and data duplication reduces duplicate IT infrastructure and operations over time.
- Readiness for change. If a significant change, like a cloud ERP implementation, is being contemplated, all the elements of this program are useful for integration, conversion, reporting, and testing purposes.
The Components of a Data Management Program
As you consider how to improve the overall state of data management at your institution, there are several components to the program that are critical. They are interrelated but distinct.
- Reporting and analysis. If we consider the significant types of reporting, operational and analytical (with financial reporting falling somewhere in between), there are tools suited to each type of reporting and analysis. Though standardization is essential, especially for operational reporting, continued exploration of new tools is vital, given the advances made in this space in recent years.
In addition, the skills required for these activities need to be cultivated and aligned in the organization. Building a community of reporting and analytics professionals in an institution, who work together collaboratively, can provide a multiplier effect on the skills at the institution.
- Integration. Most institutions have developed a de facto integration strategy made up of point-to-point interfaces, built with whichever tool each developer is most comfortable with. Building a more cohesive integration strategy, including an integration platform at more complex institutions, creates a far more agile and efficient way to integrate new technologies.
Integration is more closely aligned to traditional developer skills; this specific area is a specialty that also needs to be cultivated. A great deal of work has been done to refine the strategies, methods, and technologies in this space, including methods and technologies that can enable “citizen integrators” to properly integrate systems through low-code and low-tech tools.
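To make the contrast concrete, here is a minimal, hypothetical sketch of the hub-style pattern an integration platform enables (the topic names, systems, and class are illustrative, not from any specific product): a source system publishes a change once, and any number of consumers subscribe, rather than every pair of systems maintaining its own point-to-point feed.

```python
from collections import defaultdict

class IntegrationHub:
    """Toy event hub: publishers emit a record once per topic;
    subscribers receive it without knowing about each other.
    This replaces N x N point-to-point interfaces with N connections."""

    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, record):
        for handler in self.subscribers[topic]:
            handler(record)

# Hypothetical example: the SIS publishes one "student.updated" event;
# the LMS and the data warehouse each consume it independently.
hub = IntegrationHub()
received = []
hub.subscribe("student.updated", lambda r: received.append(("lms", r["id"])))
hub.subscribe("student.updated", lambda r: received.append(("warehouse", r["id"])))
hub.publish("student.updated", {"id": "S1001", "name": "Jane Doe"})
```

Adding a new downstream system is then a single `subscribe` call, not a new interface to build and maintain against the source system.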
- Data governance. Has your institution defined stewardship authority over specific types of data? If so, what authority do data stewards have in how the data is secured, stored, and utilized? These are complex questions that require a change in culture around data management. There are tools to help with this, but the most critical aspects are organizational alignment and leadership.
Tambellini has seen increased interest in this topic, especially as demand for data and analytics from senior leaders grows. Leadership will experience frustration and distrust if the answers to their questions are slow to come, are challenging to produce, are of poor quality, or yield conflicting answers. Data governance is critical for data-driven decision-making. It is not an easy discipline to establish. (The word discipline should give a clue as to why!)
- Data warehousing. The methodologies and technologies around data warehousing have advanced in many ways over the last decade. These advances should cause every institution to review their current situation to ensure their data needs are going to be met over the coming years. Large, on-premises warehouses that take months or years to scale or augment are being replaced by data lakes and warehouses in the cloud, data virtualization technologies that can reduce the warehousing needs, and large single-platform ERPs that offer cross-functional operational and analytical reporting right from the ERP. Data visualization tools exist on their own and are embedded in many transactional platforms, offering a dizzying number of options for institutions.
All these changes and options should push institutions to rethink their data warehousing strategy, including looking at the skills they have developed. Experimenting with new technologies and providing a path for staff to learn and grow with these technologies can bridge the gap, as these skills are in high demand in the market.
How to Get Started
Each of these activities can be overwhelming on its own. Together, they can cause high stress among the data analyst and technology community. In the best case, the roles involved in this effort range from developers and analysts to front-line business leaders, institutional research, and executives, including presidents, provosts, CIOs, CFOs, and deans.
As with any effort with such a broad reach, different tactics work for different types of organizations. Tambellini encourages an open conversation among leaders and practitioners, experimentation with new technologies, and time for learning about these topics before a strategy is set. One method that has shown promise is to find a small set of critical questions to use as a pilot. These questions should be actionable and ongoing. For the pilot to be successful, senior leaders must agree that its outcome will be used to make a set of data-informed decisions and that they will rely on this source together. Then the underlying structure can be built around just this set of data, from sourcing and cleansing to storage and analysis, in conjunction with data stewards and data owners.
Another activity that can help get a data management program started is to convene a group of data owners and data stewards. This group can start the process of inventorying what you have and how it is managed, creating or consolidating data definitions, and mapping out the movement of data across your campus. This can be a very revealing exercise and can build or augment the relationships between these areas that have roles in data management.
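As an illustration only (the field names and the sample entry below are hypothetical), the inventory such a group would build can be sketched as a simple catalog that records, for each data asset, its definition, its steward, its system of record, and everywhere it flows:

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One entry in a hypothetical institutional data inventory."""
    name: str
    definition: str
    steward: str           # who is accountable for this data
    system_of_record: str  # where the authoritative copy lives
    flows_to: list = field(default_factory=list)  # downstream copies

catalog = [
    DataAsset(
        name="enrollment_headcount",
        definition="Students registered for at least one credit as of census date",
        steward="Registrar",
        system_of_record="SIS",
        flows_to=["data warehouse", "IR reporting", "budget model"],
    ),
]

def movement_map(assets):
    """Map each asset to every place it lives: each copy is a place the
    data must be secured, consistently defined, and eventually retired."""
    return {a.name: [a.system_of_record] + a.flows_to for a in assets}
```

Even a spreadsheet with these same columns serves the purpose; the point of the exercise is making ownership, definitions, and data movement visible, not the tooling.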
These actions start the process of change management, signaling to the campus that a change in the way data is managed has begun. They can be augmented by developing a set of principles regarding data management that these programs are built around. One principle that sets a tone for change is the notion that data is an institutional asset. Data may be managed in different parts of an institution, but the institution retains ownership. This principle begins to break down barriers between data silos and can start the cooperative process required for success.
As 2020 progresses, Tambellini will be producing research on these topics. We believe strongly that these efforts are critical at any time, but especially as you prepare for large projects in the coming year. Creating a robust data management program can ease the burden of making changes to your environment and deliver meaningful outcomes that improve the performance of the entire organization.