Public science lectures (in Danish)

Big Data and efficient algorithms

2016.02.01 | Laura Althoff Press

Date: Tue 23 Feb
Time: 19:00–21:00
Location: Lakeside Lecture Theatres, Aarhus University

Professor of Computer Science Lars Arge, Department of Computer Science, Aarhus University

What exactly is Big Data?
Can Big Data prevent your house from being flooded by torrential rain or storms?
Why are planes flying all over Denmark and firing laser beams at the ground?
Is Google better at predicting flu epidemics than doctors?
How do you look things up efficiently in a large encyclopaedia, and how do you sort a large set of numbers efficiently?
How did the transport company UPS save ten million litres of fuel annually?

The enormous and exponentially growing amount of detailed data generated by an increasingly digitised world puts us on the threshold of the greatest information revolution since Gutenberg invented the printing press in the mid-1400s and gave us new ways to store and disseminate information via books, newspapers and other printed matter. Today we gather information via the Internet – everything from websites and social media to the myriad of sensors and other measuring devices located in many places and embedded in many products.
A new survey of the height of the Danish landscape is an example of the way large amounts of detailed data are becoming increasingly accessible and difficult to manage. With a laser-based altimeter mounted on aircraft that have flown across the whole of Denmark, the height of the Danish landscape has been measured with an unprecedented degree of precision and density. The density is actually four measurements per square metre – corresponding to 150 billion measurements altogether.
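As a back-of-envelope check, the quoted figures are mutually consistent: 150 billion points at four measurements per square metre correspond to about 37,500 km² of surveyed ground, close to Denmark's land area of roughly 43,000 km² (the land-area figure is an assumption, not stated in the text):

```python
# Back-of-envelope check: what ground area do 150 billion points
# at 4 measurements per square metre correspond to?
points = 150_000_000_000        # total measurements quoted in the text
density = 4                     # measurements per square metre
area_m2 = points / density
area_km2 = area_m2 / 1e6
print(f"{area_km2:,.0f} km^2")  # 37,500 km^2 -- close to Denmark's
                                # ~43,000 km^2 land area (assumed figure)
```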
Big Data is the name given to the deluge of digital data and the new opportunities it creates in science – and in society in general. This large amount of data has led to a paradigm shift. Researchers from many disciplines previously spent most of their time collecting data – now they increasingly spend their time analysing existing data to find patterns and correlations. However, processing these enormous amounts of data efficiently is often a challenge in itself. It is done using algorithms – recipes that describe how to carry out a calculation step by step. And developing efficient, rapid algorithms that solve specific problems in as few steps as possible is often the key to new insights or products.
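The encyclopaedia question earlier is a classic illustration of what "as few steps as possible" means: looking through every entry takes up to n steps, while binary search on sorted entries halves the remaining range each time and needs only about log₂(n) steps. A minimal sketch (illustrative, not from the lecture itself):

```python
# Binary search: find a target in a sorted list by repeatedly
# halving the search range -- roughly log2(n) steps instead of n.
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid, steps          # found: index and steps taken
        elif sorted_items[mid] < target:
            lo = mid + 1               # target is in the upper half
        else:
            hi = mid - 1               # target is in the lower half
    return -1, steps                   # not present

entries = list(range(1_000_000))       # a million sorted "entries"
index, steps = binary_search(entries, 765_432)
print(index, steps)                    # found in at most 20 steps,
                                       # not up to a million
```

On a million entries the loop runs at most 20 times, which is why the same idea scales to data sets with billions of items.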
In this lecture, you will see examples of simple algorithms and hear how algorithm researchers at Aarhus University have developed ingenious, efficient algorithms that make it possible to use the latest, very precise laser measurements of Denmark to predict flood risks in connection with torrential rain. You will see a demonstration of a new online analysis tool where you can explore the risk of flooding in connection with both torrential rain and high sea levels at any location in Denmark.
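As a toy illustration only – not the Aarhus researchers' actual algorithm, which handles vastly larger terrain models and rainfall dynamics – the core idea of terrain-based flood mapping can be sketched on a small height grid: a cell floods at a given water level if it lies below that level and is connected to the sea (here, the grid edge) through other below-level cells:

```python
from collections import deque

# Simplified flood mapping on a height grid (metres above sea level).
# A cell floods if it is below the water level AND reachable from the
# grid boundary through other below-level cells (BFS flood fill).
def flooded_cells(heights, water_level):
    rows, cols = len(heights), len(heights[0])
    flooded = set()
    queue = deque()
    # Seed the search from every below-level cell on the boundary.
    for r in range(rows):
        for c in range(cols):
            on_edge = r in (0, rows - 1) or c in (0, cols - 1)
            if on_edge and heights[r][c] < water_level:
                flooded.add((r, c))
                queue.append((r, c))
    # Spread the water to adjacent below-level cells.
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in flooded
                    and heights[nr][nc] < water_level):
                flooded.add((nr, nc))
                queue.append((nr, nc))
    return flooded

terrain = [
    [0.2, 1.5, 1.8],
    [0.4, 0.3, 1.9],   # the low 0.3 cell floods via its neighbours
    [1.7, 1.6, 0.1],
]
print(flooded_cells(terrain, 1.0))
```

Note that low-lying cells shielded by higher ground stay dry in this model, which is exactly why measurement density matters: a dike narrower than the grid resolution would be missed entirely.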

Lecture / talk