Saturday, July 8, 2017

Let's Make a GeoPackage Extensions Registry

One of the best ideas I have heard recently is that there needs to be a registry for proprietary GeoPackage extensions. While OGC has a number of registered extensions as part of the encoding standard, it is only natural that there are going to be proprietary ones out there. What we don't have today is a way for developers to see what exists as prior art. We run the risk of developers reinventing the wheel, and the longer this persists, the harder it becomes to deal with. The last thing we want is multiple competing extensions out there with no interoperability between them.

I believe an extensions registry will address this problem effectively, and I will do the legwork to create it. In the short term it is going to be pretty simple - just a web page with a bunch of hyperlinks. I will use the Leaflet Plugins page as inspiration; hopefully one day we will end up more like QGIS Plugins, but it won't happen overnight.
Update: this is now live at

To be added to the registry, I will need from the developer a completed Extension Template [1]. This template includes all of the information developers will need to understand what the extension does and how it works. I will post the template to (or more likely, link to it) and help publicize it. If the extension suits other people's needs then great! If a number of implementers start using a particular extension, let the GeoPackage SWG know and we will discuss whether it is ready to be adopted by OGC.

[1] The raw AsciiDoc of the template is on GitHub. If filling out a template is going to be a major problem, let me know and I'll see what can be done to help you out.

Saturday, July 1, 2017

The Styling / Portrayal Ad Hoc Meeting

On Thursday June 29, I facilitated a Styling / Portrayal Ad Hoc at the OGC Technical Committee Meeting. This ad hoc meeting was called because there is broad agreement that GeoPackage would benefit from standardized feature styling and portrayal capabilities. (Anecdotal evidence suggests that the lack of common/standardized solutions here is already inhibiting GeoPackage adoption.) However, there is also agreement in OGC circles that whatever capabilities GeoPackage uses should apply across all of OGC Simple Features. The meeting was attended by over 40 people (combined in-person and on-line[1]).

Much of the meeting centered on Styled Layer Descriptor (SLD) and Symbology Encoding (SE), two OGC encoding standards[2]. The consensus was that while these are annoying (XML!) formats to work with and they have a number of issues, they are not that far off from meeting most needs. (The geospatial community has created a number of specifications over the years, but none of the alternatives have been standardized and many of them have been abandoned.) The attendees tentatively agreed that attempting to move SLD/SE forward was the best available option for supporting the desired capabilities in a timely manner.

The attendees agreed on the following action items:
  • Rechartering the SLD/SE Standards Working Group (SWG) with the goal of producing updated standards that feature a core and extensions model and/or multiple conformance levels so that there is some separation between essential and non-essential elements
  • Finding someone to create an executable test suite, developer’s guide, and other materials to lower the bar for developer use
  • Initiating a pilot to experiment with the SLD/SE changes
My own next step is to take these findings back to OGC’s Architecture Board and try to turn them into action.

[1] Don't get me started on the pathetic state of hybrid in-person / on-line meetings in 2017.
[2] There was also discussion of portrayal registries, but that is a topic for another day.

Wednesday, May 3, 2017

Good News / Bad News

First the good news. OGC's Architecture Board (OAB) has approved our request to send GeoPackage 1.2 to the Technical Committee (TC) for an electronic adoption vote. This vote will start in a couple of weeks and there is a high likelihood that this version will be fully adopted by OGC by the end of June. 

The bad news is that the Elevation Extension has been delayed. The OAB requested that the extension be removed from version 1.2 so that it can be worked on separately. The feeling was that elevation data was too important to be rushed and that more time was needed to align it to other parts of the OGC baseline including things called "Geographic information — Schema for coverage geometry and functions"[1] and "Coverage Implementation Schema"[2].

I know this decision will cause uncertainty in the GeoPackage community. The extension does what it does and there is nothing keeping people from using it as is right now. Your mileage may vary. We should probably give it an alias (something without a gpkg_ prefix) to distinguish it from other adopted extensions. Other than that, we will have to wait for the process to play out. I made sure that someone was committed to carrying the extension to adoption.

[1] This document is dual published as OGC Abstract Specification Topic 6 (Schema for coverage geometry and functions) and ISO 19123 (Geographic information — Schema for coverage geometry and functions).
[2] This document used to be called GMLCOV but it was renamed when version 1.0.1 was adopted.

Wednesday, March 29, 2017

GeoPackage vs. Extended GeoPackage

One constant area of confusion is the notion of "GeoPackage" vs. "Extended GeoPackage". The idea is that a file using just core elements would be a GeoPackage and that one including extensions would be an Extended GeoPackage. However, this has caused problems in practice and we have had to adapt. A couple of months ago I wrote about non-spatial tables. Recently we got rid of the clause that allowed Extended GeoPackages to use the .gpkx extension (no one was using it). It turns out that this wasn't enough.

We have been getting some push-back on the WKT for Coordinate Reference Systems extension because it adds a column to the gpkg_spatial_ref_sys table. While we feel that we were justified in this decision, it does affect some design decisions. For one thing, Object Relational Mappings are more complicated when a column might or might not exist. However, we have gotten feedback from other developers that adding columns to tables as part of extensions is a reasonable and necessary technique. What to do?

I propose to tweak a few requirements to get past this. To summarize, the GeoPackage designation acts as a sort of compatibility mode. At the expense of extensions, you get maximum interoperability. If you produce a GeoPackage with extensions (by definition an Extended GeoPackage), then your risk of interoperability issues increases (though hopefully it is still small). We hope that applications and libraries will grow to deal with extensions gracefully, but in the meantime, avoiding unnecessary extensions does provide the greatest interoperability.

Does this work for you? Let me know here, on the mailing list, or on Twitter.

Tuesday, February 28, 2017

Filling a Gap: Conformance Tests

As mentioned previously, the GeoPackage 1.2 open comment period is under way. While we were preparing for this, we were briefed on testing performed by a consultant to a high-profile US Government organization. This testing called GeoPackage interoperability into question. Our analysis indicated that these interoperability concerns were due to either a lack of understanding of GeoPackage scope or the non-compliance of the data used in the tests. This is partially our fault - we had not completed the executable conformance tests needed to evaluate GeoPackage compliance. We have since found that some of the sample data posted to (and used in these tests) is not even fully compliant.

We are trying to fix this. As of last month, the OGC-sponsored conformance tests for GeoPackage (available here) only contained tests for the core and tiles portions of version 1.0 of the standard. This month we built tests for the features portion and the elevation extension and brought the whole suite up to date with version 1.2. In the process of building these tests, we identified a number of (hopefully) minor issues that we will resolve during the required comment resolution period.

Following are our next steps:
  • Deploy the executable tests to the OGC testing site so that they are accessible
  • Update the structure of so that it displays multiple versions of the standard (1.0.2, 1.1.0, and the proposed 1.2.0)
  • Resolve the open issues generated during the comment period
  • Refresh all of the sample data posted on, ensuring that all data passes the conformance tests 
I ask for your patience as we work through these tasks. This is taking a long time, but we are committed to getting it right.

Tuesday, February 7, 2017

Preparing for GeoPackage 1.2

The GeoPackage SWG continues to make changes to the GeoPackage Encoding Standard. Our goal is to make the standard clear, concise, and self-consistent. Following on 1.0.1 (which focused on features) and 1.1.0 (which focused on tiles), we have made a number of changes focusing on extensions. As always, we insist on maintaining backward compatibility. In fact, there are no substantive changes to existing parts of the core this time around.

We plan to release this version as GeoPackage 1.2. For detailed information on this release, please review the release notes. The changes range from typographical fixes to substantive changes that alter requirements. Following is a summary of the substantive changes:
  • Adding an "Attributes" section to describe the use of non-spatial data
  • Deprecating Requirement 69 (a mandate for extension F.1 that was nearly impossible to achieve or verify)
  • Deprecating Annexes F.2, F.4, and F.5, three extensions that were determined to be non-interoperable and non-usable
  • Updating the column name for WKT for Coordinate Reference Systems (Annex F.10)
  • Adding the Elevation Extension as Annex F.11
  • Changing the way GeoPackage versions are declared based on community feedback 
We initiated a 30-day comment period on Friday. After the comment period concludes, we will address all of the feedback. After that we will request a vote by the OGC Technical Committee to adopt 1.2 as an official encoding standard. We hope that you will take this opportunity to check out the draft and let us know if there is anything that can be improved.

Wednesday, January 4, 2017

Fun with Interfaces and Pointers in Go

Now that I am doing server-side programming again, I have fallen in love with the Go language. Long story short, it provides just about everything I need to produce quality server-side code and deliberately withholds language features that have unintended consequences in code quality and maintenance. I believe this has enabled me to produce higher quality code faster than with other languages like Java.

However, it has not always been smooth sailing. I recently hit an unexpected bump in what I thought was fairly simple code. As it turned out, I had not fully grasped how Go handles interfaces and pointers. Allow me to explain the scenario, and in so doing, illustrate how Go interfaces and pointers work. (I only present the minimal code needed to get the point across. Anything extraneous to the point is omitted.)
  1. I wanted to have a custom error struct that had some state information in it. This was done by creating a struct called Error that implements the error interface's one required function, Error().
    type Error struct {
        Message string
    }

    func (e Error) Error() string {
        return e.Message
    }
  2. I wanted Error to lazily initialize its state. For this to work properly (i.e., retain that state after multiple calls to Error()), I needed to use a pointer receiver in my Error() function. Note the difference between the code below and the code above.
    type Error struct {
        Message string
    }

    func (e *Error) Error() string {
        if e.Message == "" {
            e.Message = time.Now().String()
        }
        return e.Message
    }
  3. For functions like foo() that are only ever going to return my own struct, I figured that I could return *Error so that I would have full access to its members without any type checking. (As you will see later, this turned out to be a bad idea!)
    func foo() *Error {
        return nil
    }
  4. Then I could subsequently return the results (an error or nil) to a higher function.
    func bar() error {
        return foo()
    }
  5. Finally I would be able to call my bar function from high level code and check the return value.
    e := bar()
    if e == nil {
        fmt.Println("Hello, playground")
    } else {
        fmt.Println("Message: " + e.Error())
    }

There is just one problem – this code panics with a nil pointer dereference. Try it out here.

Why this happens is probably not obvious to inexperienced Go programmers. What is it that implements the error interface? Not Error, but rather *Error. The problem is that bar() must return either something that implements the error interface or nil. Well, *Error implements that interface. As written here (without all of the other logic that would make the function meaningful), foo() returns a nil *Error, and when that nil pointer is stored in the error interface, the result is not the same as a nil interface. Repeat that to yourself, more than once if you have to.
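The distinction can be demonstrated in a few lines (a standalone sketch, not from the original program):

```go
package main

import "fmt"

type Error struct {
	Message string
}

func (e *Error) Error() string {
	return e.Message
}

func main() {
	var p *Error    // a nil pointer
	var e error = p // the interface now holds (type=*Error, value=nil)

	fmt.Println(p == nil) // true: the pointer itself is nil
	fmt.Println(e == nil) // false: the interface carries a non-nil type
}
```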

So what have I done? I created a struct, used a pointer receiver to implement an interface, called a function that returns a pointer to my struct, and returned that pointer (which just so happens to be nil). Well crap, since an interface holding a nil pointer is not the same as nil, the equality check fails and we go on to dereference a nil pointer. Kaboom! The fix is simple – change foo() to return error instead of *Error. Done! (Oh well, my assertion in #3 was wrong, so I will just have to use a type assertion to check whether my error happens to be a *Error.)
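Here is one way the corrected program might look (my sketch of the fix described above, keeping the lazy initialization from step 2):

```go
package main

import (
	"fmt"
	"time"
)

type Error struct {
	Message string
}

func (e *Error) Error() string {
	if e.Message == "" {
		e.Message = time.Now().String()
	}
	return e.Message
}

// foo now returns the error interface directly, so a nil return
// really is a nil interface value.
func foo() error {
	return nil
}

func bar() error {
	return foo()
}

func main() {
	e := bar()
	if e == nil {
		fmt.Println("Hello, playground")
	} else if myErr, ok := e.(*Error); ok { // type assertion replaces the *Error return type
		fmt.Println("Message: " + myErr.Error())
	}
}
```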

The moral of the story is that if your function returns an error, return an error and not something else that implements the error interface. However, sometimes the journey is more important than the destination. You can't fully understand the language until you truly understand how it handles interfaces and pointers.
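As a footnote to the pointer-receiver point in step 2, here is a small sketch (with hypothetical type names) showing why the receiver kind matters for retaining lazily initialized state:

```go
package main

import "fmt"

type lazyVal struct{ msg string }

// Value receiver: the method mutates a copy, so the caller's state is lost.
func (l lazyVal) Get() string {
	if l.msg == "" {
		l.msg = "init"
	}
	return l.msg
}

type lazyPtr struct{ msg string }

// Pointer receiver: the method mutates the original, so the state sticks.
func (l *lazyPtr) Get() string {
	if l.msg == "" {
		l.msg = "init"
	}
	return l.msg
}

func main() {
	v := lazyVal{}
	v.Get()
	fmt.Println(v.msg == "") // true: state was not retained

	p := &lazyPtr{}
	p.Get()
	fmt.Println(p.msg == "") // false: state was retained
}
```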