THE data boom has brought with it huge opportunities: with everyday items doubling as data sources and huge advances in capture techniques (in both sophistication and affordability), companies can vastly improve their decision-making and transform project outcomes.
As Jonathan Rosenberg, the former Google executive, once wrote: “Data is the sword of the 21st century, those who wield it well, the Samurai.” But for those working with and manipulating data, responsibility comes too – as we have seen from a series of headline scandals such as the 2018 Cambridge Analytica affair.
At Gaist, we capture and analyse data about the road and roadscape. From the millions of street-side images we bank each month, we extract information that helps our clients better understand their assets.
In addition to being fully compliant with GDPR and ensuring all our business services and processes adhere to data protection laws, we are highly conscious, at an ethical level, of the data we hold and the information that could be extracted from it.
A constant consideration for us is non-standard use of our data by third parties. Much of the data we create could be released as open data, but release sometimes carries unintended consequences. In short, a side-effect of collecting and creating the data we have is that we hold a lot of information that could be used by others with less-than-good intentions.
Our key defence against this problem is to keep asking ourselves one question: “What if?”
For example, local authorities use the data we collect to predict the performance of their assets and to maintain them accordingly. But if a bad actor had access to this data, they could interpret it in a way that could aid “no win, no fee” insurance claim farmers.
With this in mind, we are very careful, when releasing data, to make the restrictions on its use clear.
We also create a lot of data for communications companies undertaking 5G and fibre roll-outs. They rely on our intelligence for the effective planning and delivery of their projects.
The data we collect could be used by the local authority to police the installation and check that road works are reinstated correctly.
A better way to use this data is a joined-up approach, with the local authority and communications company working together. The communications company could prioritise laying cable in footways and roads that are already in poor condition and that the local authority is likely to resurface soon.
Another example: we could produce a league table of local authorities’ roads to show which of the UK’s roads are in the best and worst condition. This could be valuable intelligence for central government, helping to check the efficiency of road maintenance and, for example, to prioritise funding towards areas that are struggling. But it could also be used to cut funding where it is not being spent as effectively as elsewhere. We would much prefer to work with the highways authorities to make better use of data to help decision-making, rather than arm their auditors!
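To make the idea concrete, here is a minimal sketch of what such a league table might look like as a computation. The authority names and condition scores are entirely invented for illustration; this is not real Gaist data or methodology.

```python
# Illustrative only: authority names and scores are invented, not real data.
# Score = share of the road network in poor condition (lower is better).
condition = {
    "Authority A": 0.12,
    "Authority B": 0.31,
    "Authority C": 0.08,
    "Authority D": 0.22,
}

# Rank authorities from best (lowest share in poor condition) to worst.
league_table = sorted(condition.items(), key=lambda kv: kv[1])

for rank, (authority, share_poor) in enumerate(league_table, start=1):
    print(f"{rank}. {authority}: {share_poor:.0%} of network in poor condition")
```

The same ranking that helps target funding at struggling areas is, unchanged, the ranking an auditor could use to justify cuts – the ethics lie in how the output is framed and released, not in the computation itself.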
One thing that has not affected us yet, and which we feel is unlikely to because we do not process personal information, is the “black box” nature of AI and the biases it can unintentionally learn. We have seen recent news stories about racist or sexist artificial intelligence being used to make decisions around vaccines or healthcare. Artificial intelligence looks for patterns, and if one of those patterns happens to fall on a protected characteristic, such as sex or skin colour, it will make decisions based on that attribute – lawful or not.
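The pattern-learning risk described above can be shown with a toy example (entirely synthetic; not our data, our models, or any real system): a naive model “trained” on historical outcomes that correlate with a protected attribute will simply reproduce that correlation when it predicts.

```python
# Toy illustration with synthetic data: a naive model that learns from
# historical outcomes picks up any correlation with a protected attribute
# present in that history.
history = [
    # (protected_attribute, outcome_approved)
    ("group_x", True), ("group_x", True), ("group_x", True), ("group_x", False),
    ("group_y", False), ("group_y", False), ("group_y", True), ("group_y", False),
]

# "Training": learn the historical approval rate for each group.
counts = {}
for group, approved in history:
    total, yes = counts.get(group, (0, 0))
    counts[group] = (total + 1, yes + approved)

learned = {g: yes / total for g, (total, yes) in counts.items()}

# "Prediction": the model now approves or rejects purely on group membership.
def predict(group):
    return learned[group] >= 0.5

print(predict("group_x"))  # approved
print(predict("group_y"))  # rejected
```

Nothing in the code mentions discrimination, yet the decision rule is the protected attribute – which is exactly why biased training data produces biased systems regardless of intent.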
In addition to our internal work to ensure our data is used only for the ‘right’ purposes, we are also keen to learn from wider industry and the efforts being made by other organisations in the field to ensure location data is applied and used ethically.
Examples include Ordnance Survey’s Benchmark Initiative, which seeks to influence best practice in geospatial techniques.