Data criticality and brain research

Digital information is a central player in researching brain diseases – but digital data cannot necessarily be considered a “given”.

Brain-related diseases affect approximately one third of the population. Though much research has been done, one of the challenges is that data quality varies. Photo: UiO.

In the second workshop of the CDN series, discussions focused on the use of digital information in the life sciences. Digital data are needed to simulate neurological processes, to analyze large and complex datasets, and to sequence genomes and modify them with engineering techniques.

One of the biggest challenges is that despite the common format, digital information does not speak for itself. Especially not in an interdisciplinary project.

DIGIbrain - Large datasets to gain insight into brain diseases

The project DIGIbrain combines experimental data from biology and neurology with mathematical and computational modelling. Though each of these disciplines has worked with digital data for decades, the wide range of measurements and experimental approaches produces data of such complexity that they are difficult to compare and translate between fields. A lack of common standards, and often a limited understanding of the source of what is being measured, makes the interpretation of results challenging within the life sciences. A common understanding of underlying principles and concepts is often a challenge in interdisciplinary projects.

DIGIbrain seeks to tackle issues of data standardization to facilitate data sharing and interdisciplinary communication about data. Marianne Fyhn, leader of the project, confirms:

- Common standards for organizing data and metadata are key – but time-consuming to develop – representing resources we usually have not accounted for in our projects. Many of us would rather focus on the core research, but to succeed, investments in best practices for data storage and analysis will improve the quality and longevity of results.

Important to pose critical questions to data processing

This is one of the challenges that the CDN discussed further in this Tuesday’s workshop. Scholars with a background in sociology and technology suggested that the attempt to standardize information can already be considered core research.

Mareile Kaufmann, head of the CDN explains:

- In our common discussion it became clear that the many translation steps in the making of a research finding - the translation of neurological findings into mathematical variables, a meaningful comparison of mathematical and computational models, and the translation of their results back into the context of neurology - are often underestimated. As of today, these processes are not even considered central enough to warrant their own funding. And yet, they are part of creating research output.

She continues:

- The understanding of digital information in its different disciplinary contexts is essential to interpret and examine research results, and to pose critical questions to data processing. We cannot just rely on the idea that once digitized, information is a universal format.

The aspect of translation is one that Kaufmann also explores in her latest writings on prediction algorithms. Creating transparency about predictive policing processes already begins with understanding the collaborations between police officers, engineers, software designers and the actual software components. CDN's next workshop, on 18 December, will deepen the discussions on the use of digital data in intelligence and police work.

From the second workshop on brain research, left to right: J. Peter Burgess, Kjetil Rommetveit, Gisle Hannemyr, Marianne Fyhn, Mareile Kaufmann, Heidi Mork Lomell and Ann R. Sætnan. Photo: UiO.

The CDN is funded by the Research Council of Norway.
Published Oct. 25, 2018 2:40 PM - Last modified Oct. 26, 2018 9:22 AM