New tool to improve urban disaster prevention and adaptation
Researchers from the University's BIOMA Institute are helping to develop this system, which detects local disasters based on news stories published online
31 | 03 | 2026
Major disaster databases, such as the Emergency Events Database (EM-DAT), primarily record large-scale events. However, thousands of smaller-scale floods, landslides, and industrial incidents are excluded from these statistics, even though their cumulative impact can be crucial for local communities.
Leire Labaka, Josune Hernantes, and Fernando María de Villar Rosety, researchers at the Biodiversity and Environment Institute (BIOMA) at the Tecnun School of Engineering at the University of Navarra, have contributed to an article in the International Journal of Data Science and Analytics titled "From headlines to databases: leveraging LLMs for structured disaster event extraction," which uses an artificial intelligence system (large language models) to identify and record local disasters based on news published online. This method makes it possible to detect small-scale events that are typically excluded from large databases.
The system combines web scraping techniques and language models such as ChatGPT to automatically analyze news articles and extract relevant information about disasters, such as their location, date, or the infrastructure affected. "Every day, hundreds of news stories describe how a flood cut off a certain road or isolated a certain town. That information exists, but it is scattered and disorganized. Our work uses language models to systematically read it and convert it into structured data: the bridge between what newspapers report and what planners and managers need to make decisions," says Fernando María de Villar Rosety, co-author of the article.
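A minimal sketch of this extraction step, assuming the model is asked to return a JSON record per article. The prompt wording, field names, and the `call_model` stub are illustrative assumptions, not the authors' implementation; in practice the stub would be replaced by a call to an LLM API such as ChatGPT.

```python
import json

# Hypothetical prompt asking the model for structured JSON output.
PROMPT_TEMPLATE = (
    "Extract the disaster event described in the news article below. "
    "Reply only with JSON containing the keys: event_type, location, "
    "date, affected_infrastructure.\n\nArticle:\n{article}"
)

def parse_event(model_reply: str) -> dict:
    """Validate the model's JSON reply and keep only the expected fields."""
    expected = {"event_type", "location", "date", "affected_infrastructure"}
    data = json.loads(model_reply)
    missing = expected - data.keys()
    if missing:
        raise ValueError(f"model reply missing fields: {missing}")
    return {k: data[k] for k in expected}

# Stub standing in for a real LLM API call; the reply is fabricated
# for illustration only.
def call_model(prompt: str) -> str:
    return json.dumps({
        "event_type": "flood",
        "location": "Granada",
        "date": "2024-09-15",
        "affected_infrastructure": ["A-92 highway"],
    })

article = "Heavy rain flooded the A-92 highway near Granada."
event = parse_event(call_model(PROMPT_TEMPLATE.format(article=article)))
print(event["location"])  # Granada
```

Validating the reply before storing it matters because language models can omit fields or return free text instead of JSON; rejecting malformed replies keeps the resulting database consistent.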
The research team tested the tool in a case study focused on floods that had occurred in Granada. Using just 21 local news articles collected from platforms such as Google News, the system was able to identify affected roads and other impacts associated with the heavy rainfall events, achieving a 76% accuracy rate in extracting relevant information. "It's not just about collecting data, but about analyzing it and detecting patterns of vulnerability that go unnoticed. Seeing that the A-92 highway appears repeatedly in the news after every storm allows us to identify that it is infrastructure that is systematically failing and requires preventive measures," adds Leire Labaka.
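The kind of pattern detection Labaka describes can be sketched as a simple frequency count over the extracted records. The events and road names below are illustrative, not the study's actual Granada data, and the majority-vote threshold is an assumption:

```python
from collections import Counter

# Illustrative extracted events (not the actual study dataset).
events = [
    {"date": "2023-11-02", "affected_infrastructure": ["A-92 highway"]},
    {"date": "2024-01-18", "affected_infrastructure": ["A-92 highway", "GR-3424"]},
    {"date": "2024-09-15", "affected_infrastructure": ["A-92 highway"]},
]

# Count how often each piece of infrastructure appears across events.
counts = Counter(
    infra for event in events for infra in event["affected_infrastructure"]
)

# Flag infrastructure mentioned in more than half of the events
# as a candidate for systematic vulnerability.
threshold = len(events) / 2
flagged = [infra for infra, n in counts.items() if n > threshold]
print(flagged)  # ['A-92 highway']
```

Once the news stream is converted into structured records, even this simple aggregation surfaces recurring failure points that no single article reveals on its own.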
According to the authors, this system could complement databases such as EM-DAT, which only record high-impact disasters and tend to exclude smaller-scale events. "Artificial intelligence allows us to transform thousands of news reports on the impact of various climate events into structured data. This enables us to detect vulnerabilities and prevent failures in our infrastructure, a key step toward building more resilient cities and improving society's safety in the face of these events," says Josune Hernantes.
Reference
• PuimePedra, M., Elkady, S., Villar-Rosety, F.M. et al. From headlines to databases: leveraging LLMs for structured disaster event extraction. Int J Data Sci Anal 22, 65 (2026). https://doi.org/10.1007/s41060-025-01017-1