Sutowo Wong, Director, Analytics & Information Management Division, Ministry of Health – Singapore
Artificial Intelligence (AI) is one of the hottest topics today. This is a good thing, as AI can drive tremendous value if applied appropriately. It is not just companies that are taking advantage of AI. Many countries have published national AI strategies. In May 2019, forty-two OECD and partner countries formally adopted the first set of intergovernmental policy guidelines on AI.
However, this attention has also generated hype that makes it difficult to separate what is real from what is wishful thinking. In February 2019, O’Reilly published the results of its survey on AI adoption in the enterprise. “Lack of data” and “lack of skilled people” remain key factors that slow down AI adoption within many organisations. Two other common obstacles pertain to organisational challenges: 23% cited “company culture” and 17% cited “difficulties identifying use cases.”
There are many fundamentals that need to be in place to maximise the value of AI. The fundamentals can be organised into four strategic thrusts: (1) data access, (2) data product, (3) data education and (4) data collaboration.
Lack of access to data is a key barrier to the adoption of AI in most organisations. Different groups of data consumers have different needs. For example, basic users may not want to see the data points at all, preferring charts and infographics. Intermediate users may want to analyse derived data, e.g. length of stay in hospital, whereas advanced users may prefer to analyse raw data, e.g. dates of hospital admission and discharge. It is thus important to structure the data based on how data consumers use it.
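The distinction between raw and derived data can be made concrete with a small sketch. This is a hypothetical example, not a Ministry of Health dataset: it shows how the raw fields advanced users work with (admission and discharge dates) are transformed into the derived field intermediate users actually want (length of stay).

```python
from datetime import date

# Hypothetical raw records, as an advanced user might receive them.
raw_records = [
    {"patient_id": "P001", "admitted": date(2019, 3, 1), "discharged": date(2019, 3, 5)},
    {"patient_id": "P002", "admitted": date(2019, 3, 2), "discharged": date(2019, 3, 2)},
]

def derive_length_of_stay(records):
    """Turn raw admission/discharge dates into the derived field
    an intermediate user wants: length of stay in days."""
    return [
        {"patient_id": r["patient_id"],
         "length_of_stay_days": (r["discharged"] - r["admitted"]).days}
        for r in records
    ]

print(derive_length_of_stay(raw_records))
```

Serving both forms, rather than forcing every consumer through the raw data, is one way to structure data around its consumers.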
Structuring the data correctly is necessary but not sufficient. The data has to be discoverable. A common complaint from data consumers is that they are not aware of what data is available, and there is often little information about the datasets. It is therefore important to develop a data catalogue and make it available, similar to how Amazon provides information on the products it sells.
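A minimal sketch of what a data catalogue entry might hold, with searchability built in. The fields and dataset names here are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogueEntry:
    """One hypothetical catalogue record: enough metadata for a consumer
    to judge whether a dataset fits their need before requesting access."""
    name: str
    description: str
    owner: str
    update_frequency: str
    fields: list = field(default_factory=list)

catalogue = [
    CatalogueEntry(
        name="inpatient_episodes",
        description="One row per hospital admission episode",
        owner="Hospital Records Office",
        update_frequency="daily",
        fields=["patient_id", "admitted", "discharged"],
    ),
]

def search(entries, keyword):
    """Keyword search over names and descriptions, so consumers can
    discover datasets they did not know existed."""
    kw = keyword.lower()
    return [e for e in entries
            if kw in e.name.lower() or kw in e.description.lower()]

print([e.name for e in search(catalogue, "admission")])
```

Even this small amount of metadata answers the discoverability complaint: a consumer can search for "admission" and find a dataset they never knew existed.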
Data consumers need the right tools to make sense of the data they have access to. Most organisations do not give employees the privileges needed to install software, and this practice creates friction for data consumers, who usually require specialised tools. Furthermore, open source analytics tools, e.g. R and Python, require packages to be downloaded and kept up to date on an ongoing basis. Some data consumers work around the challenge by downloading the data onto their personal devices, which in turn introduces data security risks. The data analysis process can be codified end-to-end, allowing it to be shared and reused. There are tools that enable data consumers to work collaboratively and share code so that institutional knowledge can be democratised.
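Codifying an analysis end-to-end can be as simple as expressing each step as a plain function and composing them, so the whole workflow can be version-controlled, shared, and re-run by colleagues. The steps and data below are illustrative assumptions, not a specific Ministry workflow:

```python
def load(records):
    # In practice this would read from a governed data store; here it
    # simply passes records through so the sketch is self-contained.
    return list(records)

def clean(records):
    # Drop rows with missing values (an illustrative cleaning rule).
    return [r for r in records if all(v is not None for v in r.values())]

def summarise(records):
    # Produce a small summary an end consumer can act on.
    stays = [r["length_of_stay_days"] for r in records]
    return {"episodes": len(stays), "mean_stay": sum(stays) / len(stays)}

def pipeline(records):
    # The entire analysis is one composable, reusable object.
    return summarise(clean(load(records)))

result = pipeline([
    {"length_of_stay_days": 4},
    {"length_of_stay_days": 2},
    {"length_of_stay_days": None},
])
print(result)
```

Because the process is code rather than a sequence of manual steps, a colleague can rerun it on new data or adapt one step without redoing the rest, which is how institutional knowledge gets democratised.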
According to O’Reilly, lack of skilled people is another factor that slows down AI adoption within many organisations. Data scientists are hard to recruit and even harder to retain. Most organisations can only afford to hire a small team of data scientists, and even for those that have the budget to hire a big team, demand often outstrips supply. Organisations therefore need to galvanise all employees, including non-practitioners, to perform simple data analysis, so that the limited number of data scientists can focus on complex AI algorithms. This means organisations need to develop an analytics competency framework, and identify and roll out an analytics training roadmap, to improve the analytics maturity level of all employees.
One has to recognise that it is not pragmatic to perform all analytics activities in-house. Organisations need to work with partners to augment their in-house teams. Apart from commercial companies, there are many organisations in Singapore, e.g. AI Singapore, A*Star and the Singapore Data Science Consortium (SDSC), that one can partner with to deliver analytics programmes. To be effective, organisations need to focus on the key capabilities to be built in-house and outsource the rest to partners.
As organisations expand their capabilities in data exploitation and collaboration, they need to design data governance frameworks and processes to grant data access and apply user-friendly safeguards. Organisation-wide policies and frameworks will need to be drawn up so that data can be shared for a wide range of analytics purposes; these will entail secure data processing, data anonymisation and validation of consent, so as to prevent or deter data breaches and misuse.
Beyond the four strategic thrusts, the ability to sharpen problem statements and accurately translate them into data and analytics requirements is important. This is where a relatively new role, the analytics translator, comes in. Analytics translators scope, prioritise and manage analytics projects. They act as a bridge between analytics and the business.
As organisations expand their portfolio of analytics programmes, they will need to develop and implement a prioritisation framework and processes so they can optimise their resource allocation. More important than starting new programmes, organisations need to inculcate a culture of actively evaluating the performance of their analytics programme portfolio. Akin to an investment portfolio, an analytics programme portfolio requires continuous monitoring so that organisations can course-correct and cut losses on underperforming programmes if required. This frees up resources that would otherwise have gone to programmes that should rightfully be cut.
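A prioritisation framework can be sketched as a weighted scoring of candidate programmes against a few criteria, with the portfolio ranked and periodically rescored. The criteria, weights and programme names below are assumptions for illustration, not a prescribed methodology:

```python
# Illustrative criteria and weights; a real framework would set these
# through organisational deliberation, not code.
WEIGHTS = {"expected_value": 0.5, "data_readiness": 0.3, "feasibility": 0.2}

def score(programme):
    """Weighted sum of a programme's scores on each criterion (0-10)."""
    return sum(WEIGHTS[c] * programme[c] for c in WEIGHTS)

programmes = [
    {"name": "readmission risk model", "expected_value": 8,
     "data_readiness": 6, "feasibility": 7},
    {"name": "chatbot triage", "expected_value": 5,
     "data_readiness": 3, "feasibility": 4},
]

# Rank the portfolio; rescoring periodically lets the organisation
# course-correct and cut losses on programmes whose scores fall.
ranked = sorted(programmes, key=score, reverse=True)
for p in ranked:
    print(p["name"], round(score(p), 2))
```

The value of such a sketch is less in the arithmetic than in the discipline: making criteria explicit forces the portfolio conversation the article calls for.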