A cloud-based data processing and visualization pipeline for the fibre roll-out in Germany
I’m proud to announce that our paper “A cloud-based data processing and visualization pipeline for the fibre roll-out in Germany” has just been published in the Journal of Systems & Software!
To support the roll-out of fibre broadband Internet in Germany, Deutsche Telekom has set itself the goal of connecting more than 2.5 million households per year to FTTH (Fibre to the Home). However, planning and approval processes have been very complex and time-consuming in the past due to high communication overhead between stakeholders, a lack of automation, and missing information about planning areas.
Telekom addresses this problem by collecting large amounts of geospatial data (3D point clouds and 360° panorama images), which can be used to automatically find suitable routes for fibre-optic lines, to determine possible locations for distribution cabinets, and to build a 3D visualization that helps planners create detailed plans and present them to decision makers. This speeds up planning tremendously, but processing the data and creating the visualization in a short time requires automation.
In this systems paper, we present a data processing platform that we have built and operated together with Telekom over the course of the last six and a half years. The platform uses the cloud to manage Big Data in a scalable and elastic manner. It builds on our own research results: a scientific workflow management system that automates processing, and Fibre3D, a web-based tool that planners use to display the processed data and to perform fine-planning.
Besides the technological aspects, this paper also describes a practical use case that shows how the platform and Fibre3D help Telekom to speed up the planning and approval process. We also summarize lessons learned and give recommendations for the design of systems similar to ours.
With the presented technology, Telekom has already been able to connect more than 8 million households to FTTH and expects to improve on this figure in the future. We therefore consider our collaboration an example of how well knowledge and technology transfer between research and industry can work, and of the impact it can have on society.
More information
If you want to know more about the platform described in the paper and, in particular, our scientific workflow management system Steep, which we use to control the Big Data processes, have a look at the Steep website and the Telekom showcase.
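To give a rough idea of what driving such Big Data processes with Steep can look like, here is a minimal, illustrative sketch in Python. It assumes a Steep instance reachable at http://localhost:8080 and a POST /workflows endpoint for submitting workflow definitions; the API version, payload shape, and the service and variable names (e.g. point-cloud-tiler) are assumptions made for illustration only, so please consult the Steep documentation for the actual API and workflow syntax.

```python
# Illustrative sketch: submitting a workflow to a Steep instance over HTTP.
# Endpoint, payload shape, and service names are assumptions for illustration;
# see the Steep documentation for the authoritative API and workflow syntax.
import requests

STEEP_URL = "http://localhost:8080"  # assumed address of a Steep instance

# Hypothetical workflow: run a made-up point cloud tiling service on one input file.
workflow = {
    "api": "4.5.0",  # workflow API version (verify against your installation)
    "vars": [
        {"id": "input_file", "value": "/data/pointcloud.laz"},
        {"id": "output_dir"},
    ],
    "actions": [
        {
            "type": "execute",
            "service": "point-cloud-tiler",  # hypothetical processing service
            "inputs": [{"id": "input", "var": "input_file"}],
            "outputs": [{"id": "output", "var": "output_dir"}],
        }
    ],
}

# Submit the workflow to the (assumed) endpoint and print the returned ID.
response = requests.post(f"{STEEP_URL}/workflows", json=workflow)
response.raise_for_status()
print("Submitted workflow:", response.json().get("id"))
```

In the platform described in the paper, workflows of this kind control the Big Data processes whose results are then displayed in Fibre3D.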
Reference
Download
The paper has been published under the CC-BY 4.0 license. You can download the final version of the paper here.
Posted by Michel Krämer
on 28 February 2024