
Cascafico's Diary

Recent diary entries

Website URLs are in many cases not rock steady, hence monitoring their status can be worthwhile. Here is how to do it in a semi-automatic fashion:
* download POIs,
* ask each URL for a reply,
* store the OSM objects whose websites are unresponsive.

Let’s start by gathering a list of shops with a website tag. This Overpass example query yields a CSV with the essential data separated by commas. You can see the result in the Overpass data window.
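The query itself is not reproduced here; a minimal sketch of the kind of query that produces such a CSV (the area name is just an example) could be:

[out:csv(::id, ::type, name, website; true; ",")];
{{geocodeArea:Udine}}->.a;
nwr(area.a)[shop][website];
out;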

To automate the process (bash), we need the Overpass query string to provide as an argument to the wget command. In “Export”, simply copy the link you find under “raw data directly from Overpass API”, then (remembering to enclose the link in double quotes):


$ wget -O mylist.csv "http://overpass-api.de/api/interpreter?data=%5Bout%3Acsvblablablabla"

at this point mylist.csv contains something like:


@id,@type,name,website
194581793,node,Sirene Blu,http://www.sireneblu.it/   
228109189,node,Ecoscaligera,http://www.ecoscaligera.com/   
[ETC, ETC]   

Now we need to scan each line of mylist.csv and wait for an HTTP reply (i.e. 200 OK, 301 Moved, etc.). It’s done by running the following code:


#!/bin/bash
# read mylist.csv (skipping the header line) and probe each website
while IFS="," read -r OSMid OSMtype OSMname url
do
  # keep only the HTTP status line from the response headers
  REPLY=$(curl --silent --head "$url" | awk '/^HTTP/{print $0}')
  echo "https://www.openstreetmap.org/$OSMtype/$OSMid,$REPLY"
done < <(tail -n +2 mylist.csv)

Let’s call the above script replies.sh. The output could be something like:


$ ./replies.sh 
https://www.openstreetmap.org/node/287058106,HTTP/1.1 301 Moved Permanently
https://www.openstreetmap.org/node/424738144,HTTP/1.1 301 Moved Permanently
https://www.openstreetmap.org/node/534834927,HTTP/2 301 
https://www.openstreetmap.org/node/766863973,HTTP/1.1 200 OK
[ETC, ETC]

Redirecting to a file, such output can easily be filtered with grep in order to obtain a list of OSM objects whose website tag needs to be updated (to null):


$ ./replies.sh | grep  " 403 " > shops-to-update

tags: linux, bash, URL

Should I import Microsoft building footprints?

Posted by Cascafico on 19 July 2022 in Italian (Italiano). Last updated on 14 August 2022.

Data

Recently Microsoft released its global building footprints under ODbL.

Quality

I’ve found some challenging situations:

Test

I made several tests and some local imports, taking into account, in priority order:

  • do not overwrite mappers’ manual work
  • keep the highway network consistent
  • check for false footprints caused by parallax
  • simplify wherever possible
  • apply an overall offset if needed (using Strava and the OSM GPS traces repository as references)

Preveza case study

The town is a compendium of the above quality problems. Some imagery spots are washed out by the sun, and the resulting low contrast was handled oddly by Microsoft’s algorithms. It took about 5 hours to adjust the footprints (and several highways). I applied the following workflow:

  • data cropped to the Preveza landuse=residential area
  • objects below 60 square meters filtered out (see the sketch after this list)
  • objects touching existing OSM buildings filtered out
  • manual fixes of overlapping polygons
  • manual rotation adjustments on many objects (an AI problem)
  • highways adjusted using the Strava heatmap
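For the area-filtering step, a possible sketch (file and layer names here are hypothetical, EPSG:32634 is just a metric UTM zone covering Preveza, and the SQLite dialect needs GDAL built with SpatiaLite):

$ ogr2ogr -f GeoJSON filtered.geojson footprints.geojson -dialect SQLite \
  -sql "SELECT * FROM footprints WHERE ST_Area(ST_Transform(geometry, 32634)) >= 60"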

Tools

QGIS for the initial conversion, cropping and filtering. JOSM for offsets, rotations and overlap fixes.

Fixing municipalities without townhall

Posted by Cascafico on 3 March 2021 in Italian (Italiano). Last updated on 14 April 2021.

Intro

I bumped into an interesting overpass query by OSM mappers Musuruane and Albani which highlights missing townhalls. How could I use this stuff? By coincidence (and considering that the Italian open data scenario is not particularly exciting), I was surprised to find the national Public Administration addresses dataset released under CC0 :-)

Preparing a manual import

Required:
* Public Administration (PA) offices dataset
* OpenRefine tool

  • Create an OpenRefine project from the PA offices dataset. Filter as needed (i.e. titolo_resp="Sindaco" and Regione=Abruzzo)

  • Generate the Abruzzo municipalities without a townhall with the overpass-turbo query;
  • find the link “Exporting raw data directly from Overpass API”;
  • copy & paste the link into OpenRefine’s Create project > Web Addresses (URLs);
  • apply cross-table functions as needed (i.e. cell.cross("Indice PA", "Comune")[0].cells["Cap"].value);
  • export the file somewhere online with OpenRefine’s custom tabular exporter;
  • feed the above file to umap remote data, with useful tags displayed in the popup (layer interaction)

Umap

Here you can find the resulting umap, which is being used to add and detail missing municipality townhalls.
Blue layers come from the PA (input) dataset. Red layers are on-the-fly OSM queries for missing townhalls.
You can clone it and explore its layers and the remote data they use.

Importing

  • In umap, select the useful layers and zoom in to display markers;
  • find a missing townhall (red ball marker);
  • click the matching blue marker (candidate townhall) and copy its tags;
  • activate the iD editor and paste the tags accordingly;
  • toggle the blue layer to spot others and repeat.

Notes

A blue marker alone means the townhall is already in OSM; a red marker alone means no PA data is available.

Remember to allow the OSM and overpass-turbo servers time to update the red layers (generally it takes less than a minute).

Use proprietary imagery just as generic guidance.

Italian schools geocoding

Posted by Cascafico on 1 October 2020 in English. Last updated on 10 October 2020.

Intro

The Italian Ministry of Education (MIUR) publishes yearly a detailed nationwide dataset about public schools (isced:level 0 to 3). Guess which OSM-essential data is missing.

No reliable geocoding is feasible if we don’t have a homogeneous address base, and I think it’s unlikely that large nations (say, Italy, with 60 million people) feature such quality. So we need a tool to filter approximate geocodings (typically street centroids) out from good ones.

Tools

I tried csvgeocode, a light script that inserts lat and lon fields into the input dataset, but it falls short in grabbing important geocoder responses like accuracy.
Then I ran into OpenRefine (OR), which was created to fix “messy data” but can happily be used to compose geocoder requests, filter responses adequately and structure the output in whatever format you want.

Data

  • School buildings dataset, where address is well structured (updated 2018)
  • School details dataset 2020, where all the other stuff is stored (updated 2020)

Buildings dataset (EDIANAGRAFESTA)

Here you’ll find the fields for composing the geocoding request, in separate columns: street type (Via, Viale, Piazza, etc.), street name, postcode, municipality. And of course the reference field (Codice Meccanografico) needed to link the school details dataset. Using the buildings dataset, OpenRefine will combine columns to generate Nominatim geocoder requests, fetch the URL responses and filter out those OSM elements whose “type” is not punctual (typically highway=* or blank).
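As an illustrative sketch only (these column names are invented, not the actual MIUR field names), the GREL expression for an “add column by fetching URLs” step could look like:

"https://nominatim.openstreetmap.org/search?format=json&limit=1&q=" +
escape(cells["TIPO_VIA"].value + " " + cells["VIA"].value + ", " +
cells["CAP"].value + " " + cells["COMUNE"].value, "url")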

School details dataset (SCUANAGRAFESTAT)

Here you’ll find stuff like the school name, unique reference code (Codice Meccanografico), contacts, description (for ISCED), etc. OpenRefine will handle several kinds of “messy data”: accents, abbreviations, typos, titlecase, etc.

Install Openrefine

OR can be installed in a few steps. The school datasets feature 55k+ records, and OR runs nice and easy on my Raspberry board with 1 GB of RAM.

Importing Buildings dataset in Openrefine


Mass edits are bad

Posted by Cascafico on 18 May 2020 in Italian (Italiano).

problem

A fast, undocumented address import caused some concern in the Bergamo municipality. One of the problems was mixed-format housenumbers:

  • upper/lowercase extensions (i.e. 3D, 42f)
  • space in between (i.e. 3 D, 42 F)
  • slash in between (i.e. 3/D)

fixing

queries:

  • slash between number and extension
  • lowercase address extensions
  • space between number and extension

export:

overpass-turbo allows exporting to the Level0 text editor (500-element limit): just cut and paste into your favorite editor; mine are Notepad++ (Windows) and sed (Linux).

slash removal:

notepad++ regex slash removal
find:
housenumber = ([0-9]+)/([^0-9])
replace:
housenumber = \1\2

Linux sed:
sed -r -e 's:housenumber = ([0-9]*)/([a-zA-Z])$:housenumber = \1\2:' level0.txt

uppercase :

notepad++ find and replace uppercase housenumber:

find:
housenumber = ([0-9]+)(\w)$
replace:
housenumber = \1\U$2
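The same can be done from the shell, since GNU sed also supports \U in the replacement; a sketch:

sed -r -e 's:housenumber = ([0-9]+)([a-z])$:housenumber = \1\U\2:' level0.txt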

An online data-collection form

Posted by Cascafico on 27 April 2020 in Italian (Italiano). Last updated on 4 May 2020.

Premise

Before the covid-19 emergency there was always someone who, occasionally or as a hobby, went around with a smartphone mapping fountains, benches, bus stops… Then, once the emergency confined us to our homes, mappers’ activity also slowed down, or rather was “reoriented”: we moved to what we call “armchair mapping”, i.e. sitting in front of the computer to trace missing roads, coastlines, buildings and so on from aerial imagery. Could we still do something more?

Lists in the time of covid

Meanwhile, already in the first days of the lockdown, lists of more or less essential businesses offering takeaway and home-delivery services began popping up on the web, on the portals of many Italian municipalities. These lists had obviously been drawn up for consultation by the limited audience of a small or medium-sized municipality or neighbourhood; the public administration had therefore certainly not taken care of data structure or standardization, partly because we all hopefully regarded this emergency as transitory. Still, there was no shortage of IT companies organizing data collection through social media, trying to put down something more structured; these actors, by creating ad-hoc sites, triggered a virtuous circle across different social media to gather data into their own databases; which, beyond being declared a not-for-profit service, gives them the opportunity to claim ownership rights over the “harvest”.

Proposal


GeoPaparazzi, a survey tool for OSM (too)

Posted by Cascafico on 10 December 2019 in Italian (Italiano). Last updated on 12 December 2019.

Intro

When I first came across GeoPaparazzi (GP), I appreciated the fast and simple interface, which spares menu-navigation time in favour of environment observation. Besides, very few things in GP are predefined, allowing you to customize background maps and forms, which are the main features for an efficient survey. What is apparently missing is compatibility with Openstreetmap: some way to easily get osm-xml files suitable to be piped into the JOSM editor. Of course GP is not an OSM-integrated editor such as Vespucci, OSMcontributor, etc., which require a stable data connection; its strength is flexibility, quickness and simplicity in data gathering.

Form presets

GP forms provide a GUI with human-readable input and structured output. In order to show the different recording possibilities, GP comes with predefined forms; these forms define mainly text notes, but also photo and sketch notes. Forms are defined in the tags.json file (file manager: built-in storage > geopaparazzi folder). The simple trick to get an OSM-ready survey tool is customizing tags.json. Reading the default one first will give you an idea of the structure to use for obtaining OSM XMLs.
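For illustration only, a minimal OSM-oriented section could be shaped roughly like the following (field names are recalled from the default file and may be inaccurate, so check them against the original tags.json):

[{
  "sectionname": "osm amenity",
  "sectiondescription": "basic OSM amenity survey",
  "forms": [{
    "formname": "drinking water",
    "formitems": [
      {"key": "amenity", "value": "drinking_water", "type": "string", "mandatory": "yes"},
      {"key": "name", "value": "", "type": "string", "mandatory": "no"}
    ]
  }]
}]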

OSM customization


I was trying to geocode addresses via Nominatim, using the OpenRefine function “fetch from URL”, but I got empty strings. I re-ran it checking “store error box”, and I got the problem: the certificate was not recognized or missing. A little confusing: why did the browser happily download geocoding responses, while OR didn’t?

OR relies on Java functions to do the job, so I had to fix the certificate issue in the following way:

  • get certificate:
    $ openssl x509 -in <(openssl s_client -connect nominatim.openstreetmap.org:443 -prexit 2>/dev/null) -out /tmp/nominatim.crt

  • store it:
    $ keytool -import -file /tmp/nominatim.crt -keystore cacerts -storepass changeit

  • If keytool complains about not finding cacerts:
    $ find / -name cacerts -print
    /usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/lib/security/cacerts
    (in my raspbian Java installation)
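With the path found above, you can point keytool straight at the keystore; on my system that would be, for example:

$ keytool -import -file /tmp/nominatim.crt -storepass changeit \
  -keystore /usr/lib/jvm/jdk-8-oracle-arm32-vfp-hflt/jre/lib/security/cacerts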

Monitor OSM changes

Posted by Cascafico on 14 December 2018 in Italian (Italiano). Last updated on 19 March 2019.

about

A mountain hiking association asked me to monitor local hiking relations for edits done by other mappers. I wrote this script, which generates an html page with the OSM changesets involving daily changes. It is fed by an overpass-turbo query, like this one, so you can customize which OSM objects to monitor.

steps

The mentioned monitor script will:

  • extract the relations modified in the last 24h and tagged operator="CAI" in the area provided as argument (see the sketch after this list)
  • filter useful information
  • format an html file with OSM, Achavi and OSMcha links
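The extraction step could be sketched as follows; this is my guess at the shape of the real query, not the script itself (the actual monitor.sh also builds the html page):

#!/bin/bash
# minimal sketch: CAI relations edited in the last 24h in the area given as $1
AREA="$1"
SINCE=$(date -u -d "24 hours ago" +%Y-%m-%dT%H:%M:%SZ)
QUERY="[out:csv(::id,::changeset,::user,::timestamp)];
area[name=\"$AREA\"]->.a;
relation(area.a)[operator=\"CAI\"](newer:\"$SINCE\");
out meta;"
wget -qO- --post-data="data=$QUERY" "https://overpass-api.de/api/interpreter"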

automate runs

Of course it can be run periodically using crontab. To set a daily run, just execute
$ crontab -e
add a line like below, customizing your script path:
0 0 * * * /home/pi/scripts/monitor.sh >> /dev/null 2>&1

telegram

My instance includes a Telegram notification fired when a changeset list is written. I use telegram-cli with the following line in the script:
(echo "msg $CHANNEL $MESSAGE"; echo "safe_quit") | /home/pi/apps/tg/bin/telegram-cli -W

Just set the Telegram CHANNEL and MESSAGE variables and insert the line inside the $CHAN==0 else-if branch.

TODO

Fix area names which include spaces, i.e. “Cividale del Friuli”

Hiking relations in wiki table

Posted by Cascafico on 10 December 2018 in Italian (Italiano).

Intro

When adding several hiking routes, it sometimes comes in handy to have updated tables in wiki format that can easily be added to specific wiki pages.

Recipe

  • overpass query on an area name
  • some awk insertions
  • python script csv2wiki

overpass-query

This overpass query will generate a CSV file (i.e. stregna.csv) with the local-network hiking paths. Example output:

@id,name,ref,network
2071931,,747,lwn
6934010,Sentiero Italia - Tappa A14,SI,lwn
7112799,Sentiero Hum,,lwn

This awk will add an OSM link to each relation:
$ awk -F "," 'FNR == 1 {print $0} FNR > 1 { print "[osm.org/relation/"$1" "$1"],"$2","$3","$4 }' stregna.csv > stregna.txt

actual wiki formatting

Python script will generate the actual wiki text:
$ python csv2wiki stregna.txt > stregna.wiki

summarizing script

This script performs the steps above (here you can customize the query with whatever relation tags you like). Just provide an OSM area name.
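Roughly, the script boils down to something like this (a sketch: the Overpass query string is elided here, and the script name is made up):

#!/bin/bash
# sketch: ./hiking2wiki.sh <OSM area name>
AREA="$1"
wget -qO "$AREA.csv" "https://overpass-api.de/api/interpreter?data=..."   # query elided
awk -F "," 'FNR == 1 {print $0} FNR > 1 { print "[osm.org/relation/"$1" "$1"],"$2","$3","$4 }' "$AREA.csv" > "$AREA.txt"
python csv2wiki "$AREA.txt" > "$AREA.wiki"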

Dirty datasets

Posted by Cascafico on 14 November 2018 in Italian (Italiano). Last updated on 30 September 2020.

Mini abstract

I’ve found a 600+ row Bed&Breakfast dataset, available as open data from RAFVG. No geo coordinates. Since housenumbers were recently imported from a RAFVG dataset, I decided to go for geocoding. To get reliable coordinates I used the csvgeocode script attached to the Nominatim geocoding service. Nominatim requires almost perfect (standardized) odonyms, hence I started OpenRefine and a reconcile service which comes as a separate jar. The reconcile service needs a CSV with authoritative names, which I got from overpass-turbo and some filtering.

Dataset

Bed and Breakfast is a rather new dataset (Oct ’17) with more than 600 POIs. It has many useful fields, such as:

  • name and operator
  • phone
  • email
  • site
  • opening hours
  • category (standard, comfort, superior)

Cleaning data

Such duty has been accomplished with OpenRefine and the Reconcile plugin, connected as a reconciliation service.

In order to standardize the messy B&B addresses (entered by the B&B operators themselves), I had to provide Reconcile with an authoritative set of highway names, which I got from overpass-turbo (see the Strade d’Italia diary entry).
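A query along these lines can produce the authoritative list (the area name is just an example; pipe the output through sort -u to deduplicate):

[out:csv(name; false)];
{{geocodeArea:Friuli Venezia Giulia}}->.a;
way(area.a)[highway][name];
out;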

Geocoder

As happened for other projects, I chose csvgeocode, which features pretty simple usage.

Here is a run using mapbox service:

$ csvgeocode input.csv output.csv --handler mapbox --delay 1000 --verbose --url "http://api.tiles.mapbox.com/v4/geocode/mapbox.places/{{INDIRIZZO}},{{CAP}} {{COMUNE}}.json?access_token="

here using nominatim, instead:

$ csvgeocode input.csv output.csv --handler osm --delay 1000 --verbose --url "http://nominatim.openstreetmap.org/search?q={{INDIRIZZO 1}}, {{COMUNE}}&format=json"
Rows geocoded: 468
Rows failed: 114
Time elapsed: 879.4 seconds


CAP (postcodes) of Italy

Posted by Cascafico on 29 August 2018 in English. Last updated on 30 September 2020.

Abstract

Codici di Avviamento Postale (postcodes) are the property of Poste Italiane S.p.A. Sometimes OSM mappers who love data completeness cannot get these codes from open sources.
The procedure below is a simple workaround, taking advantage of the iPA (Public Administration index) open data, which lists public offices widespread across Italy.

Getting data

Go to iPA site and download “Amministrazioni” file. Here you can read when it was last updated.

Filtering data

On downloaded file, run:
$> awk --field-separator="\t" '{ print $3","$6 }' amministrazioni.txt | sort -u > CAP.csv
The command above takes TAB-separated columns 3 and 6 (municipality and postcode), sorts unique lines and writes a comma-separated file.

Alternatively, you can switch columns for postcode sorting:
$> awk --field-separator="\t" '{ print $6","$3 }' amministrazioni.txt | sort -u > CAP.csv

Multi postcode municipalities:

Some cities have more than one postcode: it could be somehow useful to list public office addresses (column 10) to approximately identify the area:
$> awk --field-separator="\t" '{ print $3","$6","$10 }' amministrazioni.txt | sort -u > CAP.csv

Take a look

To count extracted postcodes, simply:
$> wc -l CAP.csv
8638 cap.csv

For an overview of firsts and lasts codes:
$> head -3 CAP.csv && echo "…" && tail -3 CAP.csv
Abano Terme,35031
Abbadia Cerreto,26834
Abbadia Lariana,23821

Zumpano,87040
Zungoli,83030
Zungri,89867

Download

Please find the processed files in the CSVs repo.

Strade d'Italia

Posted by Cascafico on 21 June 2018 in Italian (Italiano). Last updated on 25 March 2019.

Someone on the mailing list wondered how to represent the distribution of the various national odonyms (street names). The crude method would be to download a few GB of pbf from Geofabrik and use osmconvert and osmfilter. But why pointlessly transfer 99% of the data? Let’s have overpass-turbo do the work for us instead :-)

The files mentioned below are available on github, as is the bash script ItalyNames.sh that generates them.

The ISTAT code

It is known that an Italian municipality cannot use the same odonym for different streets, so the query, if run per municipality, lets us isolate just the first occurrence of each name. How do we uniquely identify a municipality? The ISTAT code stored in the ref:ISTAT tag comes to our aid. Let’s get an idea of how many municipalities there are and what their codes look like: query the ISTAT codes of Italian municipalities

I summarize the output of the municipality_codes file (7955 lines) with the first and last entries:
001001
001002
001003

111105
111106
111107

Now we know the code is numeric, made of 6 digits, and that Italy has over 7955 of them; the internet comes to our aid, revealing that the first 3 digits are the province code. If we get curious about how many provinces there are, on the municipality_codes file obtained above we run:
pi@raspberrypi:~ cat municipality_codes | cut -c 1-3 | sort -u | wc -l
109

For our purposes we also create the list of provinces, province_codes:
pi@raspberrypi:~ cat municipality_codes | cut -c 1-3 | sort -u > province_codes

With municipality_codes we could already run an overpass query in a loop and extract everything, but both to limit the requests to the server and to make future name updates more manageable, we split municipality_codes further, generating one file per province with the list of the ISTAT codes of its municipalities; for example, we obtain the file “100” (the split itself is sketched after the listing):
pi@raspberrypi:~ cat 100
100002
100004
100003
100006
100001
100007
100005
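The split can be done, for example, with:
pi@raspberrypi:~ while read -r prov; do grep "^$prov" municipality_codes > "$prov"; done < province_codes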

The overpass-turbo query script


The municipality of Milan has published a dataset of its cycleways. Let’s build a umap to understand the situation compared to OSM.

  • log in with your OSM credentials
  • select “create a map”
  • import the dataset (geojson, in our case) with the “import data” button on the right; if the file doesn’t have an explicit extension, we have to tell umap what it is with “choose the data format”; if all goes well, we get the cycleways layer in the default colour (blue).
  • we fetch the data on the fly via overpass-turbo: this is the query for Milan that extracts the cycleways from OSM; for other municipalities just replace the geocodeArea at the top (a sketch of the query’s shape follows this list).
  • in overpass-turbo, save the string that generates the query data somewhere: export > query, click on “compact” and copy the link
  • to import the query data into umap, create a new layer and open the form on the right at “remote data”; paste the string just copied, choose the “osm” data format, and switch ON the “dynamic” and “proxy request” options
  • save the map and after a few seconds (depending on the overpass-turbo load) we should also see the layer with the data extracted from OSM.
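The linked query is not reproduced here, but it presumably has a shape like this (the real one may also catch cycleway=* variants mapped on roads):

[out:xml][timeout:60];
{{geocodeArea:Milano}}->.a;
way(area.a)[highway=cycleway];
out geom;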

This is the basic procedure; after that, it’s a matter of adjusting the background (I chose the monochrome one), the layer colours, and the labels that appear on hover and on click.

Elevations mass editing

Posted by Cascafico on 13 March 2018 in Italian (Italiano). Last updated on 14 March 2018.

Update: comments pointed out that such edits are against the code of conduct, even if not automated. I leave this post up just as a personal hint on queries and regexps.


It happens that there are a lot of OSM “ele” tags whose value includes the unit of measure (i.e. ele="450 m") that can’t be processed by useful queries like “find volcanoes between 1000 and 2500 meters high” (http://overpass-turbo.eu/s/wXw).

How I remove unit of measure:

  • download the faulty data with this generic overpass-turbo query
  • from the export menu, “download/copy as raw OSM data”

  • load the resulting .osm file in Notepad++ (or whatever regexp editor)
  • find&replace (CTRL+h) twice with the following fields:
    find: k="ele" v="(\d+(\.\d+)) m" (matches, for example, “450.5 m”)
    replace: k="ele" v="$1" (leaves “450.5”)
    find: k="ele" v="(\d+) m" (matches, for example, “450 m”)
    replace: k="ele" v="$1" (leaves “450”)
  • load the modified .osm file in JOSM
  • select all (CTRL+a), add a fake tag (i.e. pippo=pluto) to every node and remove it (just a shortcut to mark the elements as modified, otherwise you won’t be able to upload anything)

Remember to add proper changeset comment.

Housenumbers from Telegram

Posted by Cascafico on 22 February 2018 in Italian (Italiano). Last updated on 23 February 2018.

Piersoft has provided “Roma Capitale” with a prototype for surveying points of interest (POIs) via a Telegram bot.

At the bottom left, the umap contains the link to download the collected data in CSV format.

The data are in the public domain (CC0); among the various POI categories there are also housenumbers; besides the number entered by the surveyor, the fields contain the survey date and the (Nominatim) geocoding of the street; to import them into OpenStreetMap you need to convert them, for example with csv2osm. Should we want to import the housenumbers into OSM, we take care to:

  1. filter the .csv for “categoria”=”Civico”: perl -i -ne 'print if /,Civico/' map_data.txt

  2. add the header to the modified .csv, keeping only its useful fields:
    lat, lon,,,survey:date,,,source,,,,description,addr:housenumber

  3. adapt the valid_tags in the .py script:
    valid_tags = ['addr:housenumber', 'survey:date', 'source', 'description']

  4. run the conversion: python csv2osm.py map_data.txt > civici.osm

Once the civici.osm file is loaded in JOSM, remove typos and invalid housenumbers by searching for them (CTRL+F) with the string:

“addr:housenumber”=[0-9]*[a-z] or “addr:housenumber”=[0-9]*

taking care to enable “regular expression” among the checkboxes on the left.

Alternatively, to avoid this editing work, I believe it is possible to install the JOSM plugin that imports CSV files, but my release is rather old and doesn’t accept add-ons.

Obviously the “conflation” problem remains, i.e. importing only the housenumbers that haven’t already been mapped, as well as addr:streetname.

Routing QA [eng]

Posted by Cascafico on 29 November 2017 in Italian (Italiano).

Mini abstract

Quoting OSM wiki about routing

“Checking your fix. After you have fixed an error on the map you will need to wait until the revised version of the map propagates into the routing engine you are using. This delay will depend for each engine…”,

…one of the problems in Quality Assurance (QA) is the lag between an OSM edit and the update of the routing databases. As far as I know, propagation takes some hours in GraphHopper and 24+ in Mapzen and OSMRM.

Hence, when trying to solve a routing problem (i.e. at an interchange), testing can only be done well beyond editing time :-(

Let’s see how we can bypass this delay using a homemade router.

Choosing a router

Routino looks simple, functional and flexible. I picked this SW because it works flawlessly even on a small device like my Orange Pi: as an example, you can route in Friuli Venezia Giulia with this live instance, updated daily. For larger areas we’d soon experience performance problems; so let’s leave global routing to the above-mentioned services and focus on a kind of on-demand patchwork…

Automating patch generation

The user should be able to choose the area to debug: to achieve this (and due to my zero experience with html/perl scripts) I created a Telegram bot named Routino Patcher and a bash script to manage its content.

What happens is:


Location: Baldasseria Media, Udine, Friuli-Venezia Giulia, 33100, Italia

Routing QA

Posted by Cascafico on 3 October 2017 in Italian (Italiano). Last updated on 19 October 2017.

Mini abstract

As reported in the wiki about routing (“Checking your fix. After you have fixed an error on the map you will need to wait until the revised version of the map propagates into the routing engine you are using. This delay will depend for each engine…”), one of the problems of Quality Assurance (QA) is the time that passes between an OSM edit and its integration into the routing databases. From what I’ve seen, GraphHopper updates faster than Mapzen and OSMRM, but it still takes some hours for the former and more than a day for the others.

If I find a routing problem at an interchange and try to fix it, the feedback comes well beyond the editing time. So let’s see if we can work around this delay, the homemade way…

Choosing a router

Routino seems simple, functional and flexible. My choice fell on this SW because it runs nimbly even on my Orange Pi, which lets me route over the data for Friuli Venezia Giulia. For larger areas this little machine would soon hit its hardware limits, so let’s leave worldwide routing to the above-mentioned services and look at how to generate a sort of on-demand patchwork on the fly…

Automating the build

The user must be able to choose the geographic area to check; to that end (having no html5 programming skills) I created a Telegram bot, which I called RoutinoBot, and a bash script that manages the data obtained. What happens is:


OSM Analytic Tracker (OSMAT)

Posted by Cascafico on 18 September 2017 in Italian (Italiano). Last updated on 25 March 2019.

Mini-abstract

Tracking vandalism is often not an easy task, and I find digging through the history of OSM changesets of little use for this.

OSMAT, instead, features a clean list of creates/modifies/deletes and tag changes; the latter is particularly useful to track “advanced vandalism”.

OSMAT focuses on fast detection, so don’t expect to configure it on a national basis, since rows would start to scroll too fast for spotting problems. For the same reason, OSMAT relies on a minimal base of mappers and (quoting its wiki) “Improve[s] team spirit of a regional OSM team/task-force”.

The following installation procedure was brought live on an Orange Pi PC single-board computer. Here you can find a regional live instance for Friuli Venezia Giulia, Italy.

Docker installation

OSMAT deployment is Docker based, so refer to Davide’s Docker installation.
UPDATE: as of 2019-03-20, the installation command quoted at the above link no longer works on my Debian Jessie armhf. Please use the following:
$ sudo curl -sSL https://github.com/moby/moby/raw/v1.13.1/hack/install.sh | sh

OSMAT download

Then download OSMAT sources:
$ git clone https://github.com/MichaelVL/osm-analytic-tracker.git

Check docker service

$ sudo service docker status
● docker.service - Docker Application Container Engine
Loaded: loaded (/lib/systemd/system/docker.service; enabled)
Active: active (running) since Sat 2017-09-16 11:40:37 CEST; 2 days ago

OSMAT configure

Once you’ve got Docker up and running (I had to upgrade the Linux kernel to > 3.10 on my Jessie Orange Pi) and before building the container, you have to configure the active region.

The default available regions are the ones defined by Geofabrik. For instance, navigate to nord-est, take note of the .poly link (nowadays it is https://download.geofabrik.de/europe/italy/nord-est.poly) and edit the configuration file:


Checking a trail network

Posted by Cascafico on 16 June 2017 in Italian (Italiano). Last updated on 19 June 2017.

Following the collaboration agreement between the Club Alpino Italiano and Wikimedia Italia, an effort will be made to bring some order to the network of CAI trail relations, so that it can also be kept up to date easily. For this purpose the OSM CAI wiki page was created, listing in detail the attributes that hiking elements must/may have.

What follows is the method I’m applying to check the CAI relations, applied to sector 7 of Friuli Venezia Giulia. The main tool used is overpass-turbo. The relation’s numeric identifier “@id” will let us easily isolate all the elements that make up a trail.

Survey status

First, a look at the geographic area involved: frame the area containing our trails and run this query, which highlights on the map the sector 7XX relations operated by CAI that have been tagged with a survey (survey:date=2017).
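A sketch of the query’s likely shape (the ref filter on 7XX is my assumption, not the original query):

[out:csv(::id, ::type, name, ref)];
{{geocodeArea:Friuli Venezia Giulia}}->.a;
relation(area.a)[operator="CAI"][ref~"^7"]["survey:date"];
out;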

Then we generate a list of them:

@id @type name ref
1867679 relation 725 725
1867680 relation 749
[…]
7272887 relation 752
7276582 relation 746 746

Similarly, we generate the list of relations without a survey:
@id @type name ref
126402 relation 727 727
1121382 relation 761a 761a
[…]
141593 relation 702
380612 relation sentiero C.A.I. 727 727
419017 relation 727A 727A

There isn’t much consistency in the use of “name” and “ref”, but we can let that pass.

OSM checks

Take for example the first relation without a survey, trail 702 with @id=141593, and check it with two useful online services:
- OSM Relation Analyzer, to see any breaks in the route, the composition of the segments, etc.
- Waymarked Trails, to see the symbology used, the elevation profile and a summary of the tags.


Location: Madonna delle Grazie, Fornalis, Cividale del Friuli, Udine, Friuli-Venezia Giulia, 33043, Italia