Learn@Lunch with Professor Lemuria Carter, Head of School of Information Systems and Technology Management

Expediting e-government: How is our data being managed?

My name is Lemuria Carter. I am the Head of School for Information Systems and Technology Management at UNSW. I joined UNSW about a year and a half ago, and, as you probably can tell from my accent, I did come from the United States. So most of my professional and academic career has been in the United States, and my research started there. I'm going to share with you today a project that I undertook while I was still there, before coming here, so the focus will be on state government agencies' management of data within a US state. We'll talk about what those implications are and hopefully some of that will also be relevant for Australia. As I settle into the role here, I'm definitely looking forward to learning more about the government here and about e-government, or digital government, here in Australia.

Okay, so, the agenda for today... I'm going to start off talking about my journey, just so you learn a little bit about me, before I go into the study, and it all is related. I consider myself very lucky to be able to work in a role that doesn't actually feel like work. It feels like I'm living my dream, and work feels like play. I'm able to identify interesting projects, find cool colleagues to work with, and then explore solutions to some of those problems and challenges. I do consider myself to be really fortunate. Anybody else here feel like you're working in a space that you've always dreamed of? It feels like you really enjoy what you do? Yay, hooray! Okay, I was hoping I'd get some hands. Lovely! To the rest of you, good luck on your journey.

All right, so my journey... Maybe before I start with USDA... This is going to be a short story, though when I give you the time frame, it may sound like it'll be long. Okay? All right. I knew that I wanted to get my Ph.D. when I was eight years old. Yeah, right. I won't bring you all the way up to 40 plus. Most of my family is in the medical professions. My mom's a nurse, grandmother's a nurse, my aunt is a nurse, so I figured, okay, I'm little. I'll be a nurse when I grow up. Then I realised I faint at the thought of blood, not even the sight. So, I thought, "Oh my gosh, I could never be a nurse. This isn't for me!"

I was at a dinner party at my aunt's house, going with the nurses, and she introduced me to a guy there, a guest. I don't even remember his name. We'll just make up a name. She said, "Honey, this is Dr. Johnson." You know, I'm a kid, and I said, "Oh, you're a doctor. Do you operate on people?" She said, "No, honey, he's not that kind of doctor." I was like, "Oh, what is it?" She was like, "He's a doctor of philosophy." I thought, "Well, what is that?" She said, "When you get older, if you study something for a long time and become an expert in it, you can be a doctor." I thought, "Oh my goodness! I can't be a nurse, but I can be a doctor!" I didn't really know what that meant, but it planted the seed. Luckily, I loved academics and it all worked out.

I ended up, for my undergraduate experience, getting a scholarship from the United States Department of Agriculture, and I'll tell you how that came about. The United States Department of Agriculture wanted to increase the number of minorities that worked in Ag and hence majored in Ag-related fields. You may wonder, "Well, you said you're in information technology." I applied for the scholarship and was awarded the scholarship. It came with a list of things you could major in: things like animal science, food and nutrition, a lot of things that I wasn't interested in. At the very bottom was computer science, and I thought, "Perfect. I'll do that."

But again, I'm not in computer science. I get to campus the first day, orientation, and I sit beside some random person. I don't know who this person is, and they ask me, "What are you going to major in?" I said, "Computer science." He said, "You should think about information systems. You'll still learn about technology, but you'll learn about it in a business context." Now that part sounds really noble. The other piece that he said that was equally persuasive was, "It's a little easier than computer science." So I thought, "All right, let's switch to information systems." No one really knew what the difference was between computer science and information systems. USDA certainly didn't know. A lot of people still don't know today.

All right, here we go, I'm in information systems, interning with the USDA, the United States Department of Agriculture, working on my undergraduate degree. I'm about to finish and I get a scholarship offer for a Master's programme from Virginia Tech. Now, I got a four-year scholarship from USDA, so I'm supposed to go and work for them for four years after I graduate. I said, "Okay, USDA, will you just give me one more year? Virginia Tech is going to pay for my Master's degree." They said, "Okay, go ahead."

So I get into the Master's degree. We're getting closer to my goal of becoming Dr. Carter. Towards the end of that... I've told everybody. Just like you, anybody who'll listen, I tell them. I did the same when I was at Virginia Tech. I told all the teachers, department chairs, "One day I'm going to be Dr. Carter. One day I'm going to be Dr. Carter." I'm about to graduate from the Master's programme and the department chair said, "Hey, you're a great student. I know you want your Ph.D. Why don't you stay and get it with us?" I thought, "Wow." And, "We'll pay for it." This is amazing!

All right, so I go back to the USDA: "You've given me a year, will you give me four more years to work on this Ph.D.?" They said no. But I don't really take no for an answer. No is just an opportunity to find a way to get to yes. I said, "Okay, here's what I'll do. I'll complete my dissertation on something that's relevant to you." I kept asking, and I guess they could see I was not going to go away, and I'd at least countered, so they said okay. That means now I need to study something related to technology in government. This was in the early 2000s, at a time when government was really starting to look at how to use technology to interact with citizens and to improve business processes. So the timing was perfect.

So, here we go. I had my Ph.D. in Information Systems and I started exploring the role of technology in government. Again, luckily, all of these things in hindsight look very planned. They all kind of grew organically, and I really enjoyed the journey along the way. That is the preface for why we're here today, with me talking about how data is being managed in the public sector.

The previous institution that I worked at was Virginia Commonwealth University. It was located in the capital of the state of Virginia and we had a lot of partnerships with state government agencies. They were very interested in understanding how to use technology effectively, what were they doing right, what were some opportunities for improvement. That sparked this study that I'm going to tell you about today.

The title, I cannot take credit for. One of the wonderful people on the Learn@Lunch team came up with this very jazzy title. As academics, we usually do something really long and boring that drones on forever. So here we go. Expediting e-government. How is our data being managed?

A study came out last year from the National Association of State CIOs, NASCIO, in the United States that interviewed CIOs to understand their views on data. Naturally, many of them said data is the lifeblood of government, as it is for many organisations, and many of us in our personal lives. They talked about the need to focus more on data management, data quality, bringing all impacted stakeholders into the conversation.

The motivation for this study really doesn't need a lot of time to talk about. We know how much data is used now by everyone, and there's definitely a need for more research, and for us to understand how it's being used, and what's working and what isn't working. Many government agencies still don't have data management or data governance plans. Again, remember this is mostly focused on US government agencies; I suspect some things are still applicable here. There's a need for more studies in this area. To the best of our knowledge, that is, the knowledge of the team that worked on this project, there hasn't been a comparative benchmarking of the performance of state government agencies around data management using a single-stage model. That was our motivation for the study.

I should say that I worked with several other people, both practitioners and academics. The team included a former CIO; a practitioner who works for Anthem, a big insurance company in the US, doing data management and data analytics; and another academic who does a lot of work around data management.

What we looked at were the stages of maturity for data management within state government agencies in a particular state in the US. There is some history of looking at stage models, or the progression of the use of technology within an organisation. Nolan published something in the Harvard Business Review in '79, which was later adapted to the e-government context by Layne and Lee in 2001. Since then, a lot of people have started to look at stage models and what they mean for e-government. Maturity models emerged as a way to define and quantify some of the critical success factors along that pathway to growth. In this study, we're looking at the maturity of data management within government organisations.

To date, a lot of the studies that have looked at maturity models have been conceptual, because it can be challenging to collect the data. You need access to the government agencies, and it can be quite time consuming. We go beyond just a conceptual analysis to conducting an empirical study.

Here we have the problem statement: state agencies' current status and level of maturity in data management is unmeasured. So we really started from zero. We can't benchmark if we don't even know where we stand. In order to address this, we conducted an initial data management maturity assessment. We used the CMMI. Anybody heard of it? Yeah, I saw some hands down there. Lovely. That is the Capability Maturity Model Integration. It was designed by Carnegie Mellon to help organisations evaluate processes and process improvement. It can be applied to a single project, a division, or an entire enterprise. It's a very comprehensive assessment of an organisation's processes. That model has been adapted to focus specifically on data management maturity, so the DMM, the data management maturity model. That's what we're looking at here.

We worked with 15 state government agencies. The slide says "State X": when we published it, we anonymised it, but we do have permission to disclose the state, so it was the state of Virginia. The goal was to present some actionable recommendations to those state agencies based on what we found. There were 15 agencies, including the Department of Motor Vehicles, the Department of Transportation, Health Professions, and the Department of Education. A range of different types of state agencies were included. We thought Virginia was an appropriate state to use; it ranks in the top third of US states by Gross State Product.

How did we collect the data? First, we started with a key informant approach. We met with a senior official in the state, talked to this person about our project, and received recommendations on who else we should interview: who were some of the key players, and what were some of the key agencies? That official identified the 15 agencies that we worked with. The people that we talked to, we wanted them to be the ones who were most involved in or knowledgeable about data management practices in their respective organisations. They were people such as the CIO, or maybe the director of data quality management. We collected the data through in-person interviews that lasted about two hours. Prior to the interviews, we collected documentation on how the organisation managed its data, sent the interviewee a list of questions so they could anticipate how the conversation would go, and reviewed some of their documents, such as their strategy plans around data management.

Coming back to the data management maturity model, the DMM: it assesses five categories, looking at things like your data management strategy, data operations, and data quality. Within each of those five categories, there are process areas. It's a very comprehensive measurement. I'm just giving you a very high-level overview of the instrument and the tool.

It's looking at those five practice areas, and we're really interested in the maturity level within each of those. We would go from level one, where whatever process we're measuring, say data management strategy, is simply performed: there is a strategy in place. At level three, the practices are standardised and consistent. All the way up to level five, optimised, where there is even a process for improving the data management capabilities themselves. We set the goal, or the ideal level of maturity, at three, for a few reasons. The CMMI, which this data management maturity model is based on, recommends a three. The US Department of Defense requires its contractors to either perform at this level, level three, or be able to get to that level within 18 months. That was, essentially, our target. When we analysed the agencies, we were really curious to see how many were at the defined level.

The interview responses and the work products, that is, the data management strategies and documentation, were scored using a 126-item survey instrument. We're looking to see whether they attain the practice, and at what level, full or partial, and how mature they are at that practice.

Just to give you a little example, if we were to assess the functional practice survey item "a data management strategy representing an organisation-wide scope is established, approved, promulgated, and maintained", this would be one item out of the 126. We would rank where that organisation is along that statement based on the interview and whatever other documents we received: their strategy, list of data management objectives, their policies, programme metrics, and analysis results. This is one out of the 126 items, so how in the world did we do this?
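
Since it can be hard to picture how 126 item-level ratings turn into a maturity score, here is a minimal sketch of how such a roll-up could work, assuming a simple averaging scheme. The item names, achievement weights, and the scaling onto the one-to-five range are hypothetical illustrations, not the actual CMMI/DMM scoring rules.

```python
# Minimal sketch of a DMM-style roll-up. Item names, achievement
# weights, and the 1-5 scaling are hypothetical, not the real rules.

TARGET_LEVEL = 3  # level 3, "defined": practices standardised and consistent

# Hypothetical mapping from an assessor's rating to a numeric weight.
ACHIEVEMENT = {"not": 0.0, "partial": 0.5, "full": 1.0}

# A few hypothetical items for one process area, rated from the
# interview and the agency's documentation.
responses = {
    "data management strategy": [
        ("organisation-wide strategy established and approved", "partial"),
        ("data management objectives documented", "full"),
        ("strategy reviewed and maintained on a schedule", "not"),
    ],
}

def process_area_level(items):
    """Average the item ratings and scale onto the 1-5 maturity range."""
    mean = sum(ACHIEVEMENT[rating] for _, rating in items) / len(items)
    return 1 + 4 * mean  # 1 = performed ... 5 = optimised

for area, items in responses.items():
    level = process_area_level(items)
    print(f"{area}: {level:.2f} (target {TARGET_LEVEL}, gap {TARGET_LEVEL - level:+.2f})")
```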

There were four co-authors on this paper. As I mentioned before, we conducted this in the state of Virginia, and we had a really strong partnership with the governor's office. There was already an established governor's data internship programme that was looking for opportunities for these interns. They were professionals who had perhaps gone back to school to work on a postgraduate degree, and the programme was looking for opportunities for them to understand data management practices. For the students in our executive Master's degree in Information Systems, this was their capstone project. That gave us a team of people that we could oversee, and they could comb through all the documents. They could collect the data and then we managed the process, because it was quite massive.

We looked at three things when we analysed the data. First, interagency analysis, so at the enterprise level, just the overall landscape of data management across those 15 state agencies. What does that look like? Then we looked at the intra-agency analysis, so we were really just trying to see what's the most common maturity level for an agency. How does that one particular agency perform along the standard? Then a cross-case analysis, where we would compare two different agencies along some of the key process areas for data management maturity.

I'll give you a little snippet of each one of these, just so you get a taste for what they look like. Don't try to read this, okay? This is just to show you, at a high level, all the process areas and then all of the organisations and their scores around those process areas on that scale of one to five: one being performed, five being all the way up to optimised and doing a great job.

This makes it a little easier to ingest. We have a circle, and the outside of the circle is level three. For each of these process areas, if the blue line reached the outside of the circle, that agency would be at the target. For data communications, people right now are performing at 1.5. Nothing reaches a three. It's very clear to see that we're still in the early stages of getting a handle on data management across all 15 government agencies that we looked at. Some of the areas that are performing at a higher level than others are the architectural approach, the data management platform, and historical data retention and archiving. Understandably, with records management, they had paper-based systems, and those probably were some of the first things to be automated, so that is happening at a better level than some of the others. Data management strategy, we still have some work to do.

In order to compare those 15 agencies, we came up with a weighted index so that we could compare one agency to another. That is what that slide showed; I won't go into those details. When we looked at the comparison, the two that came out on top across all of those five process areas for data management maturity were the Department of Education and the Department of Motor Vehicles. At the time, the Department of Motor Vehicles, in particular, had a very progressive and charismatic CIO who really drove a lot of initiatives forward.
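
As a rough illustration of what a weighted index like that could look like, here is a small sketch. The weights and agency scores are invented for the example; the first three category names come from the talk, and "data governance" and "platform and architecture" are my guesses at the remaining two. Treat it as the shape of the calculation, not the published figures.

```python
# Hypothetical weighted index for ranking agencies across the five DMM
# categories. All weights and scores here are invented for illustration.

WEIGHTS = {
    "data management strategy": 0.25,
    "data governance": 0.25,
    "data quality": 0.20,
    "data operations": 0.15,
    "platform and architecture": 0.15,
}

agencies = {
    "Motor Vehicles": {"data management strategy": 2.4, "data governance": 2.1,
                       "data quality": 1.9, "data operations": 2.0,
                       "platform and architecture": 2.2},
    "Transportation": {"data management strategy": 1.6, "data governance": 1.5,
                       "data quality": 1.8, "data operations": 1.7,
                       "platform and architecture": 1.9},
}

def weighted_index(scores):
    """Combine per-category maturity scores into one comparable number."""
    return sum(WEIGHTS[cat] * score for cat, score in scores.items())

for name in sorted(agencies, key=lambda a: weighted_index(agencies[a]), reverse=True):
    print(f"{name}: {weighted_index(agencies[name]):.2f}")
```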

We looked at interagency first. The next one we looked at was intra-agency. Again, all I want to highlight is that across the five process areas, one agency can perform at level three for data management strategy but at a different level for data quality. It's not a uniform assessment. We looked at all of the different areas of performance.

Then for the cross-case assessment, in this example, we just took the Department of Motor Vehicles and the Department of Transportation and compared how each of those performed. The things that stuck out with the biggest differences between them were data management strategy, where the Department of Motor Vehicles was performing at a higher level, though no one reached the target; making a business case, where, again, that very charismatic CIO was able to help; and data quality and quality assurance. Again, we have a tonne of data that we're excited about going through, but it does take a lot of time. I'm looking forward to being able to delve even deeper into some of this data.

In terms of discussion, all of the organisations clearly want to perform at a high level as it relates to data management, but we still have a way to go. The report, we're hoping, can be a tool to help them move forward, to identify for them the next steps. Some common themes: most state agencies have fairly low data maturity, so initial efforts probably should be focused on attaining the second level. We're not even quite ready to shoot for the third level yet, for most agencies. Most of them need to develop and implement an agency-wide data governance process, establish agency-wide standards, and implement agency-wide metrics for those standards. Those are some of the high-level recommendations for each agency. We reported back to them along each one of the process areas.

I'm just showing this slide as an example. So, around data management strategy to the Department of Conservation and Recreation, one of the things they need to do is establish a vision statement. For each agency, the next steps and the recommended action items are different, but this is just to show you that around each category for each agency, there are a set of actionable recommendations.

Overall, what are the implications for practice, for state governments? Achieving level three might sound simple, but the path forward can be quite complex. Sometimes there isn't the skill or the funding necessary to really move forward at the pace they would like, so we're recommending an incremental approach and prioritising resources. Even among agencies that have initiated data management processes, sometimes those processes are still a bit vaguely described. In terms of research, it's important for us as researchers to bring something new, to respond to an existing gap or an area where there isn't enough work being done. Essentially, that is what we're doing here. For practice, we're really happy to be able to present those agencies with actionable recommendations and some insights customised for them.

There are a couple of limitations which present opportunities for future research. Agencies were assessed by separate teams of interns, so there could be some fluctuation in the way people scored them. To try to account for that, we sent our assessment back to the managers that we talked to, to see if they agreed with our assessment. Where there were discrepancies, we would work through them.

The model captures the maturity of various processes and process areas, but it doesn't evaluate the scope or complexity of the data in use by the agency. It's just one very standardised approach and metric that we go through. A more nuanced approach to examining each organisation would be good, but because we were interested in making comparisons across agencies, we used the standardised instrument. Another limitation of this maturity scoring process is that it doesn't account for the rate of change. Some areas change at a different pace than others, but that's not accounted for here.

In conclusion, as we know, government agencies in the US are increasingly placing more importance on training and assessing data management. The federal government recently passed a law to support evidence-based policymaking based on open government data sets. So, again, making sure we have data quality is very important. With that said, I think we are very close to time. This study has been published in Government Information Quarterly. It was published last year online, and it'll be forthcoming in print this year, if you're interested in learning more about the project. Thank you.

Q+A led by Peter Leonard - Professor of Practice, School of Information Systems & Technology Management, School of Taxation & Business Law

Peter:               So we're going to go for Q and A. I might just have to kick it off by asking Lemuria two questions.

Lemuria:           Two? One at a time, Peter.

Peter:               The first one is, I'm always intrigued with these kinds of maturity assessments, particularly with government, as to whether they tell you the truth, and how you know that they're telling you the truth. Because I would have thought that the relevant people have a real incentive to talk up how well they're doing, particularly when they're being compared to their peers. So, the first question is-

Lemuria:           I'm going to go ahead and answer that first one. Okay, so I think that is a great question. There's no way to ensure it, but we feel relatively confident. One, because the level of performance was so poor. I would hope if they were embellishing, they would have at least made it to three. Right? There's a sense that we really don't know what we need to do, and we want to know, and so in order to improve, they have to first take an accurate assessment of where they are. They genuinely wanted to know where they stood and how they could move forward, and the results were confidential-ish. They then gave us permission to share it with others, but I think they were genuinely interested in knowing where they were and how to move forward. If they were untruthful, then I'm very concerned about the actual state of data management in Virginia.

Peter:               I've got a second one. There's a lot of angst at the moment, well in academia, but also increasingly in the media, around algorithmic decision making by the government. Of course, you need three things to make algorithmic decision making work. You need data, you need humans to translate data into algorithms, and you need algorithms. If they're failing so much at the first stage of the quality and assessment of the data, are we at all ready to look at algorithmic decision making by government, or should we be very worried?

Lemuria:           I don't want to incite panic, but I do think we still have a ways to go. I think there are a whole lot of challenges and risks associated with algorithmic decision making, and this group of agencies definitely was nowhere near ready to start implementing that.

I don't think there's anything that could stop someone from doing it, but I don't think anyone is really trying to do that just yet. They're still trying to get a handle on the fundamentals. But I do think those discussions will continue to increase, and as technology moves forward, it's usually a little bit ahead of where we're ready to keep up and deal with it. I would hope the government continues to exercise caution as we move forward with technological advancements. For now, that group is really just trying to get a fundamental data management strategy and processes in place.

Peter:               Questions from the floor?

Speaker 1:        All right, so my name is Vlad. I work in a consultancy as a tech advisor and guarantor. I actually saw that we basically supported this research at some point as well. My question is: through the research, did you get to the reasons why the level of maturity is so low? Basically, is it a lack of leadership to lead the whole thing, or a lack of understanding of where we are going, an end point we are trying to reach? Thank you.

Lemuria:           Yeah, thank you. Thanks for your question. I think the reasons vary a bit by agency, but one common challenge was funding and support. They were still trying to perform whatever their basic function was for the citizens and for the state, and then also be able to try to do a good job in this space. It varied a bit, but funding and resources continued to be a challenge.

Speaker 2:        So I'm surprised that it's a lot less far down the track than I thought it was. I thought we should be more worried about things like AI than whether the data was actually being collected and analysed effectively. Would you say that business, that the corporate sector, is much further ahead than government on this stuff? Or do you think that if you looked under the hood of the information systems of a lot of our corporates that it would be similarly behind the eight ball?

Lemuria:           Without trying to make a sweeping generalisation: in general, probably. Business is probably a little ahead of government on the whole. I welcome feedback from the other IT professionals in the room. That is a generalisation, but overall, I would say so. But even within industry, very few organisations have reached the top level along that DMM. I can't remember which ones have, but only a few. It's definitely a work in progress for most organisations. I do think the government is a little further behind. I do think we should be cautious about moving forward too quickly with some of the more advanced technologies. I do think we still need to focus on the fundamentals in government.

Lemuria:           I presented this at UQ a few months ago, and one of the faculty responded that, in addition to the research implications, this also had interesting pedagogical implications, because in higher ed we're trying to make sure students are equipped to meet industry demands, that they know the latest tools and the latest techniques, but there are still organisations out there that really need to understand the fundamentals.

Speaker 3:        Thank you. Just to follow up on that last question, I'd be interested in your reaction to the fact that the state government recently sold off the Land Titles office to be managed by private industry. This is a very mature sort of system.

Lemuria:           Okay.

Speaker 4:        We have a thing in Australia called the Torrens system, which means that the government guarantees title to land. That really is a very fundamental element in the financial maturity of the country, unlike in the US, where I gather you actually take out insurance against your title being challenged somehow or other. Anyway, the point is that the government recently sold it off for quite a huge amount, $1.5 billion I think it was, for it to be managed by private industry. With this money, they are now pulling down and rebuilding a football stadium, which is fantastic, of course, for those interested in football, but I don't think it's all that good for those who are interested in security of title.

Speaker 5:        Pulling on that last question, what is the proof that private industry is more capable of managing data of such value and importance for the security of land ownership and other such systems? Thanks.

Lemuria:           Yeah, thank you. Again, I wouldn't want to encourage that sweeping generalisation that private is always better than public. Just, in general, they typically have a few more resources and are motivated by different things that kind of move them forward a little bit quicker, and in different ways, than government. That is a complete generalisation. It doesn't mean that there aren't exceptions, and I'd welcome your feedback on that, Peter, because you're more familiar with this space, too.

Peter:               I think it's highly sector specific. Certainly from my experience, for example, financial services and telecommunications are much more advanced than some sectors that you would expect to be further along. In particular, insurance is coming from a long way behind, though some of the providers, like IAG, have really engaged with data maturity, data management, and using data effectively. I think there's a huge range out there.

Peter:               The difficulty, I think, with outsourcing, is that government doesn't always put the right criteria in place around assessment of the outsourcer: to actually evaluate, and if need be require, organisations to develop the appropriate level of maturity to give the confidence that you're seeking. I don't think it's a simple question of whether the private sector is better than the public sector. I think often the question is... Well, the sell-off of the asset to the private sector creates an opportunity for government to force a transformation and cause the private sector to do it better than the government did. But government is not often well-advised or experienced enough to create the right conditions to ensure that that transformation occurs.

Speaker 6:        Hi. Just to take it back a bit to the state government agencies, what do you think, from your work, would act as... So all of these agencies are doing their business as usual, and they're trying to improve their data management alongside that so they can do their business better, but generally the everyday takes over. What do you think acts as incentives, or could act as incentives, to move that along? Because usually organisations only move when there's an incentive to work towards.

Lemuria:           Right, so it needs to be something that's embraced from the top of the organisation. For this particular group of 15, we've continued to work with them, so each year, we're reassessing, so we hope... You know, as you keep meeting and keep talking about it, and somebody's coming back to look at what you're doing, you hope you'll be a little further along when they come back. So one of the next steps is then to compare where they were at this point in time, this was in 2017, to where they may be now. But, it is a mountain of data to get through. Hopefully we'll be able to come back as a team to look at it, but I do think it being important from the top and reinforced will help it move throughout the organisation.

Lemuria:           I think there was a question over here? Oh, one over here. Okay.

Speaker 7:        Just wondering, with the great interest in technological capabilities and analytics and data-driven insights, do you see the desire for insights being the key driver for better data management? And, perhaps the FSI sector is an example of that actually occurring, given the low level of maturity that we currently see in government.

Lemuria:           I would hope so. It's hard to say, but I definitely hope so. Again, specific to this group, a lot of the people were very passionate about delivering good services in whatever area they were in, and they understood that this was an important part of that and this would enable them to perform even better. They were very passionate about doing a good job for the citizens and for whatever area they were managing, and saw this as one tool that could help them be better. So hopefully the desire for those insights to continue to perform better will drive some of this forward.

Speaker 8:        Thanks. My question was around the key informants. As a qualitative researcher, I'm interested in what they had to say about the tech vendors that they're dealing with. We're kind of trying to deconstruct that binary between the private sector and government, and they're actually very symbiotic because government departments are very much reliant on private sector technology products. I'm wondering if the levels of maturity are very much dependent on the kinds of products that these agencies are reliant on. As a practitioner, I've seen organisations have to contort themselves according to the requirements of these products. So, yeah, I'm wondering if you could say a bit more about that.

Lemuria:           Yeah, thank you. No, I think that's an excellent question. I actually don't have an answer for you, but I wrote it down because I do want to follow up. I'll have to go back and get into the data and see how that played out in the relationship with the tech vendors and how it impacted their maturity. Yeah, thank you. I appreciate it. That's going to be a note for future research.

Lemuria:           He's been waiting patiently, too. Yes, okay.

Speaker 9:        Excellent study. I really like that. Couple of things. Did you have any insurance-related state agency in your list of 15 departments?

Lemuria:           Off the top of my head I don't-

Speaker 9:        The second one is, did you also get any insights into Anthem's maturity, and did you assess that?

Lemuria:           We didn't. We tried to just focus on those 15 state agencies. We didn't look at Anthem. I will check my list and let you know about the insurance. I do have it in here somewhere. I can't remember off the top of my head.

Speaker 10:      Thanks. In one of the earlier questions, the lady used the phrase "looking under the hood," and I was thinking of autonomous vehicles as an analogy. We know the data comes in, but they're not ready to do right-hand turns and they're still running over pedestrians. If you could just, as an analogy... If I'm not in this data management area, when it goes wrong, how wrong does it go for the private citizen compared to the autonomous vehicle?

Lemuria:           Very wrong, in that example.

Speaker 10:      What sort of things can go wrong in data management for the private citizen? Who's affected?

Lemuria:           I really wouldn't know where to start with that question. It would just depend. There are so many things that could go wrong, but let's also say there are so many things that could go right. That is why we need to move ahead with caution, but it does depend. So, in your example, death, if the autonomous vehicle runs over you. In others, maybe there's a loss of privacy, maybe your private information is disclosed, which could then lead to other undesirable events. That is a rabbit hole, more so than even under the hood.

Peter:               Yeah, look it's an interesting question because it invokes both the questions of the quality of the data and the quality of the algorithms and the quality of the humans that are applying the data to determine an outcome. Really, what this study focuses on is the quality of the data that is the key input into, potentially, a human making a decision. If you overlay that with a machine making a decision, then you've got a whole new layer of issues, or prospective issues, but you've still got the same underlying question of the quality of the data.

                        What can go wrong with government? Well, lots of things can go wrong. A classic in the US is that there's comprehensive legislation against racial discrimination, but it is possible to use things like geography, geolocation, as a proxy for race. If, for example, you know a particular neighbourhood is predominantly Hispanic or African-American, then the use of geolocation as proxy can be used to effect a discriminatory outcome without looking at race at all. There's all of those complicated issues around possible discrimination, not to mention the sort of inconvenience that inappropriate targeting might create.

                        Exhibit A in Australia is robodebt. Robodebt was not so much a data problem as a data interpretation problem: the development of algorithms that didn't fit the data and led to bad outcomes. People of limited literacy or understanding received letters that made it look clear that they owed the government money, when in fact, in the case of many of the recipients, the calculation was not based on good data.

                        For those not familiar with it, just very quickly, the issue there was sending letters to Centrelink recipients who appeared to have earned income in the relevant year, telling them that they had to pay back their Centrelink payments because they'd earned income. Whereas, in fact, it turned out that the data sets being analysed were not sufficiently granular. So you could have a situation where, for example, you had been on Centrelink payments in the first part of the year, perfectly entitled to them, and then Centrelink had actually done what it was meant to do, which is help you get a job. You got a job, and you started paying tax in the second half of the year. The next thing you got was a letter saying pay back the Centrelink payments, when you were perfectly entitled to them and Centrelink had actually done its job of getting you a job.

                        That's an example of, even if you've got the data right, you have to be careful as to how you use it and translate that into outcomes. You have to be particularly careful when you're using it in relation to individual citizens that might have limited understanding, and you're presenting this as though it's scientifically valid, when in fact, it might be based on pretty bad data analytics.

                        Sorry, you've probably got a load of examples like that, as well.

Lemuria:           You're good to go.

Speaker 14:      Data security and data hacking are probably among the biggest things that have been getting a lot of publicity recently. How important is that in your study, and is that something that is important going forward with these instrumentalities?

Lemuria:           Yeah. Let me see, I think I have a slide that shows what's included and what isn't included. Here we go. While data security is important, it is not inside this DMM. Definitely important, but again, we had to scope it so that we could get through the process. Data security was one of the things that was not included in the study. Not because it's not important, but just because we put a scope around it, and we used this model, which doesn't include it. Good question.

Speaker 15:      Before, you mentioned that you're still working with the government departments in Virginia. I guess, now that you've got the results, and you are continuing to work with them, what approach are you taking? Are you still modelling it off the DMM that you used?

Lemuria:           We're still using it, yeah.

Speaker 15:      Okay, so you're actually using that as a basis in which to try and move the organisations forward into a higher level of maturity?

Lemuria:           Well, not actually to move them forward, we hope they do move forward, but to monitor how they are moving. We're really observing, and then we're sharing that back to them, and we hope they will use that to move forward. We're looking at how they handle it, and how that evolves over time. We haven't looked at the next set of data yet.

Speaker 15:      Are there any guidelines available that they could use to sort of measure how they go, aside from that measurement? Does the US government have some sort of guidelines on how they could go about improving their situations?

Lemuria:           So, yeah, we made specific recommendations to each agency for what they should do next based on where they are: here are some things that you should do to move forward to the next stage. And after each assessment, we give them that feedback. We present them with some steps they can take to continue to move forward in each area. Overall, I'm not aware of overarching guidelines, and it's so nuanced. I mean, it really depends on the organisation.

Peter:               We promised to get you out of here on time, so I have to deliver on that promise. Thank you, Lemuria, for a really interesting presentation.

Lemuria:           Thank you.

Peter:               So, just to remind you that this talk is part of a series, and that we'd be delighted to have you along to the next in the series, which is Dr. Kate Dunn on Wednesday, the 11th of March. Dr. Dunn will be speaking on human centric medical technologies to drive greater health outcomes, something we all, especially us older people, think is a wonderful thing. Have a wonderful afternoon and thank you for coming. All the best.

Lemuria:           Thank you! That was great.
