Airport comments by @espinielli

Comments 1 to 7 of 7


Typo in encoded runway surface

The runway data dictionary specifies "ASP" for asphalt and "CON" for concrete, while the CSV data set currently has:

07L/25R ASPH

07R/25L CONC
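
A quick way to catch every such mismatch at once is to scan the dump for surface values outside the data dictionary's code list. A minimal Python sketch, assuming the dump is runways.csv with 'surface', 'airport_ident', 'le_ident', and 'he_ident' columns (the file and column names are assumptions here, not verified against the dump):

```python
import csv

# Codes taken from the data dictionary as cited in this comment;
# extend with the dictionary's full surface-code list as needed.
VALID_SURFACES = {"ASP", "CON"}

with open("runways.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        surface = row["surface"].strip().upper()
        if surface and surface not in VALID_SURFACES:
            # Report any runway whose surface code is not in the dictionary.
            print(f"{row['airport_ident']} {row['le_ident']}/{row['he_ident']}: {surface}")
```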


Typo in encoded surface for runway 12/30

The surface for runway 12/30 in the CSV data set should be encoded as "ASP" instead of the current "ASPH", as per the data dictionary.


Typo in surface in CSV data set

The surface for Geneva's (LSGG) 04/22 runway is reported as "CONC", while the data dictionary says the value for concrete should be "CON".
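
Since all three of these reports boil down to the same pair of misspellings, a bulk fix is straightforward. A hedged sketch, reusing the assumed runways.csv layout from the first comment:

```python
import csv

# Map the misspelled codes seen in these comments to the
# data dictionary spellings.
FIXES = {"ASPH": "ASP", "CONC": "CON"}

with open("runways.csv", newline="", encoding="utf-8") as src, \
     open("runways_fixed.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Leave any value not in the map untouched.
        row["surface"] = FIXES.get(row["surface"].strip().upper(), row["surface"])
        writer.writerow(row)
```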


Add IATA code

The IATA code for LHPR is QGY; see also Wikipedia.

HTH
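
For anyone patching a local copy before the fix lands upstream, a minimal sketch, assuming the dump is airports.csv with 'ident' and 'iata_code' columns (file and column names are my assumption based on the data dictionary):

```python
import csv

with open("airports.csv", newline="", encoding="utf-8") as src, \
     open("airports_fixed.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        # Fill in the missing IATA code reported in this comment.
        if row["ident"] == "LHPR" and not row["iata_code"]:
            row["iata_code"] = "QGY"  # per Wikipedia
        writer.writerow(row)
```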


(no subject)

How did that airframe in the satellite image arrive there?


Field 'ident' in CSV file uses old ICAO (HSSS)

I am reporting that, in the CSV dump of the airport DB, the 'ident' field still uses the old ICAO code, HSSS, instead of the new one, HSSK.

(Probably the 'keywords' field needs changing too.)

My understanding, from the data dictionary page (https://ourairports.com/help/data-dictionary.html), is that the 'ident' field holds the ICAO code when one is available:

"The text identifier used in the OurAirports URL. This will be the ICAO code if available. Otherwise, it will be a local airport code (if no conflict), or if nothing else is available, an internally-generated code starting with the ISO2 country code, followed by a dash and a four-digit number."


Field 'ident' in CSV file uses old ICAO (SPIM) instead of new

I am reporting that, in the CSV dump of the airport DB, the 'ident' field still uses the old ICAO code, SPIM, instead of the new one, SPJC.

My understanding, from the data dictionary page (https://ourairports.com/help/data-dictionary.html), is that the 'ident' field holds the ICAO code when one is available:

"The text identifier used in the OurAirports URL. This will be the ICAO code if available. Otherwise, it will be a local airport code (if no conflict), or if nothing else is available, an internally-generated code starting with the ISO2 country code, followed by a dash and a four-digit number."