This blog focuses on the need to modernize the public sector more rapidly so it can take advantage of AI, and stresses the importance of getting the fundamentals right – which must be anchored in robust data governance.
There is no escaping that AI is rapidly integrating into every business process, at work and at play – and in time, the intelligence gathered from this growing data acquisition will consolidate into deeper insights that can be used for both citizen and employee well-being.
However, our public sector institutions are falling behind in the robustness of their data governance and, in particular, their AI readiness. Unless their modernization speeds up, our national security position in AI will continue to suffer.
What is Data Governance?
In the most basic terms, data governance is the process of managing the availability, usability, integrity and security of the data in enterprise systems, based on internal data standards and policies that also control data usage. Effective data governance ensures that data is consistent and trustworthy and doesn’t get misused. These practices are key to providing the foundation for AI practice management.
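To make the "policies that also control data usage" part concrete, here is a minimal sketch of policy-driven usage control. The dataset name, purposes, and schema below are hypothetical illustrations, not a standard:

```python
# Hypothetical policy registry: each dataset lists the purposes for
# which its use has been approved under governance policy.
POLICIES = {
    "benefit_claims": {"approved_purposes": {"payment_processing", "fraud_review"}},
}

def usage_allowed(dataset: str, purpose: str) -> bool:
    """A use of a dataset is permitted only if the stated purpose
    appears in that dataset's approved-purposes policy."""
    policy = POLICIES.get(dataset)
    return policy is not None and purpose in policy["approved_purposes"]
```

Even a simple check like this makes misuse an explicit, auditable decision rather than an accident.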
A World Economic Forum (WEF) report highlighted five key roadblocks that prevent the public sector from advancing its data practices in relation to AI:
1. Many public sector and government organizations have an elementary understanding of their data.
2. Employees often do not possess the necessary AI and data management skills.
3. The AI landscape is becoming increasingly complex and competitive.
4. There is less encouragement for public sector employees to be innovative and take risks.
5. AI algorithms require upkeep from the specific providers, which is an added cost for public sector organizations.
The balance of this blog discusses each of the five points raised in the WEF report, and also includes my insights and experiences in solving complex data governance challenges in large organizations, with a focus on AI readiness.
The Effective Use of Data
Data volume is out of control: many organizations, and especially those in the public sector, were never designed to handle the breadth of data hitting them like constant tsunami waves. With so much rich data locked in unstructured documents, inefficient search infrastructure, and a lack of centralized knowledge centres, often only the bare basics are available to public sector organizations.
As the WEF report so aptly stated, often even the simplest questions cannot be answered by public organizations.
Questions I always like to ask our customers from the get-go regarding AI readiness are:
1.) Do you have a data officer governing all your policies, practices and infrastructure?
2.) Do you have a data governance operating model with clear stewardship and operating metrics?
3.) Do you have a data journey improvement roadmap in place?
4.) Are you running any AI applications and how many data sets are they tapping into?
5.) Do you know how many AI algorithms you have, and when they were last reviewed or externally audited? (This question usually creates quite the reaction – I have yet to find one C-level executive who can present such a report to me in less than 24 hours. This question alone could result in months of work, and in some larger organizations it may simply be impossible.)
6.) Do you have a data risk management process to govern your data assets?
7.) How many databases do you have? Do they have primary and secondary process owners?
8.) What data types are stored in these data repositories?
9.) Do you have a centralized data catalogue that classifies all databases, so that all fields and entity relationships are well defined with clear domain owners?
These questions alone open up rich conversations on an organization's data governance maturity and create clearer context on the challenges that lie ahead in applying AI methods.
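The data catalogue in the last question can be sketched as a simple record per table. The field names below (risk_class, domain_owner, and so on) are illustrative assumptions, not an established catalogue schema:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    """One hypothetical data-catalogue record: which table lives where,
    who stewards it, how risky it is, and what it contains."""
    database: str
    table: str
    domain_owner: str                 # accountable steward for this data domain
    risk_class: str                   # e.g. "public", "internal", "restricted"
    fields: dict = field(default_factory=dict)       # column name -> data type
    related_entities: list = field(default_factory=list)  # linked tables

entry = CatalogueEntry(
    database="citizen_services",
    table="benefit_claims",
    domain_owner="benefits-data-steward",
    risk_class="restricted",
    fields={"claim_id": "uuid", "claimant_name": "pii/string"},
    related_entities=["citizen_services.citizens"],
)
```

Even this bare minimum answers questions 7 through 9 at a glance, which is exactly what most organizations cannot do today.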
The bottom line: data is what fuels AI.
So if a public sector organization cannot clearly articulate how all of its data is collected, stored, defined, and risk-classified under strong data governance, tackling complex AI programs will be a challenge – and it is likely that any investments in AI will simply not become a sustainable operating process.
Employees often do not possess the right skills for AI and Data Management
Many public sector organizations do not have qualified, skilled, and trained talent. Few have a Chief Data Officer in place, and fewer still have skilled data process owners and enterprise process champions. It takes an entire culture that cares about data as a strategic asset to get data governance heading in the right direction.
The required data and AI skills are also in high demand, and the operating costs often make sourcing them a major challenge for public sector organizations, which may be better served by outsourcing to business process management firms skilled in AI, like CGI, Deloitte, or my alma mater, Accenture, to name a few. But let's not forget that many smaller, nimbler businesses often have deeper knowledge, as they are the ones on the fringe constantly experimenting and innovating; a mix of talent is often optimal, and the costs will be significantly more attractive. Balancing knowledge from diverse sources in the field of AI is always a wiser pathway forward.
In addition to these realities, government employees in non-technical roles, like policy makers, department directors, and procurement officials, are usually not trained in the language of data and AI. There are also tremendous legal and ethical considerations impacting privacy and security when using AI solutions. Although strong AI ethics frameworks are in place, the legal frameworks are still emerging, and I expect we will finally start to see AI legislation advance in 2023/2024, given the global efforts underway and the regulatory bodies working to close the gap.
Perhaps the biggest hurdle in public sector organizations is that their functional silos are not streamlined into cross-functional work processes, nor are incentives aligned to build integrated systems so the organization functions as a unified, highly collaborative, and agile whole. With cost restrictions always a reality, smart organizational design is critical to get the right behaviours advancing: you can train everyone on AI and data, but if people practices are not collaborative and effective, new knowledge is seldom retained, let alone sustained.
The AI landscape is becoming increasingly complex and competitive.
The global AI market size is projected to grow from USD 387.45 billion in 2022 to USD 1,394.30 billion in 2029, at a CAGR of 20.1% over the forecast period. The market is dominated by leading companies like Alibaba, Amazon, Facebook, Google, and Microsoft, alongside thousands of small to mid-market players innovating daily in every industry. In other words, it is a market segment dominated by heavyweights, and keeping current requires dedicated resources accountable for AI technology innovation – people who regularly communicate with different functions to educate them about AI and make them aware of its strengths and weaknesses. Most public sector organizations do not have the operating budget for these market ecosystem sense-makers, which is another reason to partner with firms that have skills in these areas and to rely on their expertise in guiding public sector modernization so it is better prepared to take advantage of AI.
There is less encouragement for public sector employees to be innovative and take risks.
The public sector is known for not creating strong innovation cultures, as employees are not encouraged to take risks. An article in Apolitical states that government incentives for risk don't really exist: if you pull off a major improvement in service delivery, you don't get a bump in compensation or a faster promotion.
It is difficult to bring AI forward as a core competency in many public sector organizations as AI is a transformative technology and requires agility and often a great deal of experimentation and patience to get right.
AI algorithms also require constant upkeep from their providers, which is an added cost for public sector organizations.
AI models never stand still. They require continual monitoring to ensure they do not drift and that they maintain predictive quality, along with the ongoing reality of feeding in new data that can improve the model – and, of course, each new data set must be cleaned to ensure it does not introduce bias. Public sector organizations planning to take advantage of AI need to plan for ongoing maintenance life-cycles: unlike many other software products that follow static logic, AI models must evolve over time to remain relevant.
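The drift monitoring described above is often approximated with the Population Stability Index (PSI), which compares the distribution of model scores at deployment time against the current distribution. The sketch below is an illustrative stdlib-only implementation, and the ~0.2 alert threshold is a common rule of thumb, not a standard:

```python
import math
from collections import Counter

def psi(expected, actual, bins=10, eps=1e-6):
    """Population Stability Index between a baseline ("expected") sample
    of model scores and a current ("actual") sample. Higher means more drift."""
    lo, hi = min(expected), max(expected)
    width = (hi - lo) / bins or 1.0  # guard against a zero-width range
    def proportions(xs):
        # Clamp each value into one of `bins` buckets defined by the baseline range.
        idx = Counter(min(max(int((x - lo) / width), 0), bins - 1) for x in xs)
        return [idx.get(i, 0) / len(xs) + eps for i in range(bins)]
    e, a = proportions(expected), proportions(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [0.1 * i for i in range(100)]   # scores captured at deployment
shifted = [x + 3.0 for x in baseline]      # a drifted score distribution
```

Running `psi(baseline, shifted)` yields a value well above the 0.2 heuristic, flagging the model for review; an unchanged distribution scores near zero.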
In conclusion, public sector organizations can easily experiment with and deploy AI solutions, but to build a strong, robust, AI-competent organization, top-of-mind practices in data governance must be the foundational platform for modernizing our public sector. It is one of the most important jobs to do, as countries like China are already classified as a security threat to the USA due to the speed of their data collection and classification practices. Canada has yet to make a strong statement on the security threat China poses to the modernization of our public institutions.
We need to understand that this is not only a threat to our national security; more importantly, we won't be able to design and build new product innovations as fast as other countries if our public institutions cannot make this leap forward.
In addition, private sector leaders who are advancing AI efficiently and effectively have a responsibility to help our public sector institutions evolve and this will be the focus of my next blog.