
Blog


This is my personal blog. I'm not obsessive about blogging; these posts are more like 'random sightings'...
 

 

Recent posts


Standards dramatically advance streamflow and flood forecasting in the US and elsewhere – Part 2 of 5

posted Feb 10, 2016, 2:02 PM by David Arctur

In my previous segment, I introduced the U.S. Open Water Data Initiative (OWDI) and the National Flood Interoperability Experiment (NFIE). This part of my water data series is about the NFIE project in 2014-2015. 

See this post on my OGC blog

Science Helping the Data to Help the Science

posted Jan 23, 2016, 7:45 PM by David Arctur

Much of the data collected over decades of academic geosciences research can become like so many of the life forms that ever existed in Earth’s deep history — lost to time except for the scant fossil record, which now requires forensics to understand. No Earth system exists in isolation, so understanding this data also depends on context: the relationships between data granules and data sets, sometimes across institutional and disciplinary boundaries. This presentation is about recent advancements in information models and linked data that are enabling geoscientists to better connect their subjects of research into the emerging web of knowledge. This, in turn, is enabling us to ask better science questions, and more kinds of them.

Learn more, and download this presentation (Microsoft PowerPoint pptx, 24 MB), from the EarthCube host page.

Standards dramatically advance streamflow and flood forecasting in the US and elsewhere – first in a 5-part blog series

posted Jan 21, 2016, 7:33 AM by David Arctur   [ updated Feb 10, 2016, 1:57 PM ]

This is a story about how water data standards, computational hard work, high-performance computing, serendipity and synergy led to an operational capability for nationwide forecasting of streamflow and flooding at high resolution, in near-real-time. This has been evolving for several years, but has gone into hyperdrive in just the last couple of years.

See this post on my OGC blog

Bridging across bridges: engaging the geoscience research community in standards development (Part 3 of 3)

posted Jun 22, 2015, 8:12 PM by David Arctur

(Originally posted on OGC blog Tue, 2015-01-13 16:28)

At the end of Part 2, I said I’d propose, in this segment, a way to leverage EarthCube to loosely couple the NSF research agenda with the key IT standards development agendas. Let’s take another look at EarthCube, starting with the last paragraph from Part 2 of this blog, repeated here so you don’t have to flip back just for this bit:

Enter EarthCube:  The US National Science Foundation (NSF) EarthCube program is a long-term initiative to identify and nurture opportunities to make geoscience data, models, workflows, visualization, and decision support available and usable across all geoscience domains.  This is an ambitious undertaking.  Lucky for me, Anna Kelbert just published an excellent overview of the motivation and emerging structure for EarthCube, so I don’t have to repeat all that here. I’ll just say there are now about 30 funded projects in varying stages of completion, and more on the way.  These are in 3 categories: Research Coordination Networks (outreach to potential users), Conceptual Designs (core architecture frameworks), and Building Blocks (technology components). It is intended to be community driven and community governed.

Full disclosure: I’m working in multiple EarthCube projects now, and am on the Leadership Council. I’ve been active in EarthCube since 2011, and in the OGC since 1996 except for a couple years here and there; see my OGC staff bio for a summary.  So I’m pretty engaged and committed to both EarthCube and the OGC, with the goal of improving access and use of data and models across geoscience domains. I’ve also participated in ESIP discussions, and am starting to engage with RDA. I feel that the future for standards development to support the geosciences will involve all four of these consortia: EarthCube, OGC, ESIP, and RDA. 

Here’s how it could happen

Various details of EarthCube’s demonstration governance are still being worked out, but it has a Leadership Council, a Science Committee, and a Technology and Architecture Committee. These committees are creating Working Groups to address both crosscutting and project-specific issues; so far, for example, we are forming working groups for Use Cases, Testbeds, Gap Analysis, and Standards, among others. The Standards Working Group will, among other tasks, provide guidance about science and IT standards that are relevant for use across the EarthCube framework. Just how this will work is still in the early stages.

My hope for the Standards WG is that it will also consider and act on ways in which current standards could be improved to address the science questions and use cases being studied in EarthCube funded projects. These considerations could come from the work of the Testbeds, Gap Analysis, and other WGs. But the Standards WG would be the most likely place in EarthCube to then consider which standards development organization (SDO) would be best suited to follow up with: essentially, whichever SDO is the maintenance authority for the standards to be improved. The Standards WG would then decide how best to coordinate with this SDO, and take the initial steps to do so.

Among other possibilities, the Standards WG could propose a new WG of EarthCube stakeholders that would coordinate with a parallel working group in the relevant SDO. This creates “loose coupling” between EarthCube and the SDO, managed by the respective WG co-chairs; it works best when some members are common to both working groups. One example of this approach is the recent collaboration between OGC and W3C to integrate spatial data on the web. As they are being formed, EarthCube working groups can apply to EarthCube for coordination funding, such as to cover travel expenses to an ESIP or RDA meeting for discussions, or to an OGC, W3C, or other SDO meeting for standards development. Technical papers would be generated by and about this coordination, leading to broader community understanding and uptake.

Before this can happen the first time, EarthCube would have to negotiate an alliance or Memorandum of Understanding (MoU) with each desired SDO, to prepare institutionally for WG-level collaborations. These and other details will have to be worked out (with some risk of institutional barriers), but I’m optimistic that EarthCube could foster an organizational structure that would represent the academic community’s requirements and contributions in coordination with external SDO working groups.

A big factor in coordinating standards development across alliance partners is continuity: sustained communications and development work. Standards can take several years to develop, which requires a collaboration team prepared to stay engaged together for the duration. But the main focus for EarthCube working groups in this arrangement would be on the domain & computer science requirements, verification criteria, user interface guidelines, and educational outreach. EarthCube working groups seem adequate for this work, and will be especially effective when key participants are members of both EarthCube and the relevant SDO.

There will be many more discussions about this. If you have some constructive ideas, and/or want to get involved in helping to see this happen, please let me know. 

Thanks to the OGC senior staff for reviewing and suggesting edits to this segment. 

The thoughts and opinions expressed here are those of the contributor alone, and do not necessarily reflect the views of EarthCube's governance elements, funding agency, or staff. 

Bridging across bridges: engaging the geoscience research community in standards development (Part 2 of 3)

posted Jun 22, 2015, 8:05 PM by David Arctur   [ updated Jun 22, 2015, 8:07 PM ]

(Originally posted on OGC blog on Mon, 2015-01-12 13:03)

First, I’d like to mention some comments on Part 1, in which I posed this question: “A small but committed number of academic researchers are helping develop OGC standards, but the vast majority are not. Why do some get into it, and why don't more?”

One strong point from the comments (some off-blog) was that domain scientists (i.e., non-computer scientists) should not have to engage in information technology (IT) or data management issues, much less in data standards development. Rather, they should be involved in an advisory capacity, leaving the informatics to informaticians. I completely agree. Nevertheless, I’ve come across a few rare domain scientists who stand in both the science and IT worlds, and do it well enough that they make things happen on a global scale. These are outliers, and should not be taken as “typical scientists”. Nor should this be taken as a criticism of less eclectic scientists. But I’m curious whether there are ways to nudge the system that would create more opportunities and rewards for domain scientists to work with IT and standards folks.

The thing is, “people tend to do what they really want to do”, as a wise supervisor once told me, when I was trying to explain why I wasn’t getting done the things that were his priorities. He also recognized that he got the best work from employees who were tasked to do what they really wanted to do.  I completely agree with this philosophy, so I’m not advocating that domain scientists try becoming good at something they don’t want to do. There’s really an ecosystem of science and technology tasks and people, and we depend on different people wanting to do different tasks.

What I do want to see is good data management and IT practices becoming easier and more natural for geoscientists to follow, in fact making it easier for them to focus on their science without having to focus so much on the technology. I want to look at ways to improve the technology of science without distracting the scientists with that technology. Then scientists and researchers will get to do more of what they really want to do.

Examples of where this is starting to happen are the integrated tools and data sets used by the climate/met research community: data comes in netCDF format, is processed using CDO or similar tools, and is visualized using Ferret or other packages. The same goes for the Esri and HydroDesktop environments used by many environmental science researchers. Such environments handle the standards-based side of things, allowing users to conduct data search and analysis in a harmonized way. You use what you get, with greatly simplified format and coordinate conversions. But you also don’t explore beyond this horizon, most often because you don’t know what is, or would be, possible. This approach could be taken farther, such as incorporating the collection and validation of provenance and other metadata earlier and in more context-sensitive ways in users’ workflows.
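To make that toolchain concrete, here is a minimal sketch of such a standards-based workflow in Python, using the xarray library in place of CDO and Ferret; the file name and variable name are hypothetical placeholders, not a real data set.

# Minimal sketch of a standards-based netCDF workflow (the file and
# variable names are hypothetical placeholders).
import xarray as xr

# Open a CF-convention netCDF dataset; xarray reads the standardized
# metadata (dimensions, units, coordinates) with no custom parsing.
ds = xr.open_dataset("surface_temperature.nc")
print(ds)  # inspect what the metadata conventions expose for free

# Compute a time-mean field, relying on the standard 'time' coordinate.
mean_field = ds["temperature"].mean(dim="time")

# Quick-look plot, analogous to a Ferret shade plot (needs matplotlib).
mean_field.plot()

The point is not the specific library, but that agreed conventions (netCDF plus CF metadata) let general-purpose tools do the format and coordinate handling for you.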

But integrated environments also don’t cover all use cases. Note that in order to publish the results of scientific research, the underlying data must often be cited if not included in the publication; standardized data management can assist with standardized data citation. This has been an active discussion area in ESIP and RDA, but not in the OGC. (I won’t get into the debate over distinctions between data, databases, data sets, and data products; see Joe Hourcle’s excellent and humorous talk on this at Ignite@AGU a couple of years back.) This topic addresses an important part of science: reproducibility of data for subsequent verification and reanalysis. The science community would like citations to enable linking to the cited data set, whatever that takes. New discussions are taking place about citation of highly dynamic data sets. Another area is semantics, which cuts across all communities.

I also want to emphasize that science, technology, and standards development are interdependent and continually evolving.  Case 1: I’ve sat with clients while prototyping a user interface or web page design for them, and gotten this reaction: “You can do that?? Hmm, can you do this [insert wish-list item] too?”  Often the answer would be “why not?”  Standards for data exchange, model integration and visualization make that much more frequent and productive. Case 2: Every now and then something like the Internet or even “just” the iPhone comes along and shakes up the whole fabric of society, technology & science. Various standards have to catch up, and new standards and even whole communities appear. Case 3: As science and technology for satellite-based Earth observation and analysis improve, the complexity and volumes of data increase exponentially, requiring continual evolution in the standards and tools needed to support them.

(Obligatory reference on the proliferation of competing standards: http://www.xkcd.com/927/)
So if science, technology and standards are all interdependent, and standards tend to catch up as needed, what’s the problem? A big problem is that data standards development generally is not supported or rewarded in academia as it is in industry and government agencies. And that means that academic use of standards is limited to what already works for them, with little input to influence the standards evolution. So the standards community is actually missing out on a huge contribution that could conceivably come from academia, and academia is missing out on the rewards of influencing standards to help them do their work more efficiently and transparently.  

I’m not saying academia is completely missing from the OGC; there are over a hundred universities with one or more professors, researchers or students registered on the OGC portal. The majority of these universities are in Europe. The US has only about 30 universities with OGC membership, and very few of these are active in standards development. I would contend that most US university members of OGC are there to learn and master the OGC standards, rather than to help construct and advance the standards to support geoscience research. We’re also not teaching OGC standards widely in academia.    

But NSF could help here, and EarthCube might be the key.

Enter EarthCube:  The US National Science Foundation (NSF) EarthCube program is a long-term initiative to identify and nurture opportunities to make geoscience data, models, workflows, visualization, and decision support available and usable across all geoscience domains.  This is an ambitious undertaking.  Lucky for me, Anna Kelbert just published an excellent overview of the motivation and emerging structure for EarthCube, so I don’t have to repeat all that here. I’ll just say there are now about 30 funded projects in varying stages of completion, and more on the way.  These are in 3 categories: Research Coordination Networks (outreach to potential users), Conceptual Designs (core architecture frameworks), and Building Blocks (technology components). It is intended to be community driven and community governed. 

How it could happen:  In the next segment, I'll propose a way to leverage EarthCube to loosely-couple the NSF research agenda with the key IT standards development agendas. 

Thanks to Joe Hourcle, Ingo Simonis, Scott Simmons and Carl Reed for contributions to this segment. 

The thoughts and opinions expressed here are those of the contributor alone, and do not necessarily reflect the views of EarthCube's governance elements, funding agency, or staff. 

Bridging across bridges: engaging the geoscience research community in standards development (Part 1 of 3)

posted Jun 22, 2015, 7:51 PM by David Arctur   [ updated Jun 22, 2015, 8:08 PM ]

(Originally posted on OGC blog Wed, 2015-01-07 10:57)

A number of academic researchers are committed to helping develop OGC standards, but the vast majority are not. Why do some get into it, and why don't more? For the past several years I've been looking for ways to stimulate academic involvement in geospatial standards development. I'm starting to see some potential, but the cultural & institutional barriers still dominate. 

Btw, I'll be using "involved" and "committed" in the sense of the old line about ham and eggs: the chicken's involved, but the pig's committed (you can google that phrase for an interesting history).

Basically, geoscience researchers are committed to mastering their science, but generally not to mastering data management, standards development, or other "translational" matters for making their research results accessible across domains. Such efforts present whole new learning curves of concepts and processes that have much less potential to advance a scientist's career. The measure of a scientist's career tends to be a citation index computed from the number of scientific publications and the number of times other publications cite them (the h-index is the best-known example; see the sketch below). This is an arbitrary measure with a number of issues, but it is widely followed.
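For concreteness: a researcher has h-index h if h of their papers have at least h citations each. A minimal sketch of the computation (the citation counts are made up for illustration):

# Minimal sketch: compute an h-index from a list of citation counts.
# The counts below are made up for illustration.
def h_index(citations):
    # Sort descending; h is the largest rank r such that the r-th
    # most-cited paper has at least r citations.
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

print(h_index([42, 17, 9, 5, 3, 1]))  # prints 4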

Even so, there's a growing, bubbling, emergent force in the sciences that's making interdisciplinary research more important, achievable, and professionally rewarding. The concepts and problems for understanding and modeling the Earth's climate, weather, water cycles, carbon flux, and many other interrelated subjects, are requiring increasing cooperation across subject domains. We're seeing major, long-term initiatives in the US, Europe, Australia and elsewhere now, starting to poke holes in the walls between subject domains, and between science domains and cyberinfrastructure development. I want to review a selective history first for context, then show how the pieces are starting to fit together.

Going back to the early 1990s, the World Wide Web Consortium (W3C), OGC and ISO TC211 started their work at about the same time, with distinct but complementary missions to improve information sharing across nations, subject domains, and industries. These are just a few of a veritable ecosystem of formal, international standards development organizations (SDOs) and industry consortia that have gradually strengthened their alliances with each other over the years to better accomplish their goals (e.g., see the recent article about OGC & W3C). One of these consortia, the Federation of Earth Science Information Partners (ESIP Federation), which started in the US in the late 1990s, has been focused on collaborative research for Earth and space sciences. An international, all-sciences consortium (not just geo) emerged in 2013: the Research Data Alliance (RDA). Initial sponsors for RDA were the Australian government, the European Commission, and the US National Science Foundation (NSF); NIST and Japan are now getting involved. More about OGC, ESIP and RDA later.

Internetworking among the geosciences goes back much farther, to the World Data Centres and the Federation of Astronomical and Geophysical Data Analysis Services established by the International Council for Science (ICSU) to manage data generated by the International Geophysical Year (1957–1958). It became clear after the International Polar Year (2007–2008) that these bodies could not respond fully to modern data needs; they were thus disbanded by the ICSU General Assembly in 2008 and replaced by the ICSU World Data System in 2009.

Another international networking initiative, started in 2005 by the Group on Earth Observations (GEO), is the Global Earth Observation System of Systems (GEOSS). This is being developed to provide tools for data discovery, access, and decision support, with tasks organized by societal benefit areas (SBAs): climate, weather, water, agriculture, energy, health, biodiversity, ecosystems, and disasters. Thanks to a web-based broker that distributes users' queries across dozens of Earth observation catalogues hosted around the world (e.g., the Global Change Master Directory, PANGAEA, and many others), you can now reach over 14 million data collections through GEOSS, and this number grows week by week as more agencies' catalogues are registered. That may seem paltry compared with a Google search, but we're talking about qualified data searching with geographic and temporal filters, as well as the ability to select specific authoritative international data catalogues.
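As a concrete illustration of what such qualified, standards-based searching looks like from code, here is a minimal sketch using the OWSLib Python library against an OGC CSW catalogue endpoint; the endpoint URL and query terms are hypothetical placeholders, not the actual GEOSS broker address.

# Minimal sketch: qualified catalogue search via OGC CSW using OWSLib.
# The endpoint URL and query terms are hypothetical placeholders.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike, BBox

csw = CatalogueServiceWeb("https://catalogue.example.org/csw")

# A keyword filter plus a geographic bounding box (roughly the
# contiguous US; axis order can vary by server) -- exactly the kind
# of filtering a plain web search cannot do.
keyword = PropertyIsLike("csw:AnyText", "%streamflow%")
bbox = BBox([-125.0, 24.0, -66.0, 50.0])

# In OWSLib, a nested list of filters means AND.
csw.getrecords2(constraints=[[keyword, bbox]], maxrecords=10)
for rec_id, rec in csw.records.items():
    print(rec.title)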

This is just a small sampling of efforts around the world to publish and internetwork Earth observations and geoscience data. But the consortia and initiatives just mentioned have emerged as key drivers in yet newer initiatives seeking to "bridge the bridges". The more some folks find out about the world, the more other folks want to be able to relate those findings with someone else's. And as computational and network technology have advanced, it's become both easier and harder: easier to find lots of data, but harder to know what it means, and how to relate all the pieces.

This is where standards come in: without standard vocabularies, taxonomies, metadata descriptions, and interfaces for discovery and access to data, a researcher has a daunting job just putting data from disparate sources into a common framework for analysis. No wonder geoscience researchers don't have time to mess with standards development; they're too busy finding, collecting, calibrating, converting and reformatting data from multiple sources, so they can run their intended geophysical models.   <wink>

Efforts in driving standards to better support geoscience research have varied globally, with three big players emerging: Europe, Australia and the US. Geoscience research in Europe is largely funded by the European Commission through initiatives like the Framework Programmes (FP6, FP7) and now Horizon 2020. These programs have helped implement a European Directive called INSPIRE (Infrastructure for Spatial Information in the European Community), which actually defines and mandates the use of international standards for geospatial and geoscience information. Much of the core support for GEOSS has also come from FP7 projects, some of which are still underway, with more to come from Horizon 2020. Geoscience Australia and CSIRO are the main drivers for the Australian research program.

I can't say much more about European or Australian research programs, as I've been most involved in the US. But the US situation is what I really want to talk about now. We have some catch-up to do, and it's starting to happen. 

Next: About the NSF EarthCube initiative and its potential relation to standards development. 

Thanks to Mark Parsons and Ingo Simonis for contributions to this segment. 

Caveat: The thoughts and opinions expressed here are those of the contributor alone, and do not necessarily reflect the views of EarthCube's governance elements, funding agency, or staff.

OGC EarthCube Summit Recap

posted Jan 23, 2013, 1:12 PM by David Arctur

(Originally posted on EarthCube blog January 23, 2013 at 3:52pm)

On Friday, January 18, 2013, about 40 folks attended the OGC EarthCube Summit, co-located with the OGC Technical Committee meeting earlier that week. We had two breakouts: one on Architecture, especially brokering requirements; and one on how standards development organizations (SDOs) can relate to EarthCube. These discussions are summarized in the OGC EarthCube Summit Agenda and Report. Here's the final agenda, with links to the presentation slides etc. (These are also available in this dropbox folder.)

00-OGC EarthCube Summit Agenda and Report 20130118.ppt (650kb)

01-Eva-NSF Update 1.18.13.pptx (800kb)

02-Joel-OGC Stakeholder Survey Overview.pptx (1.2mb)

02-Joel-OGC Stakeholder Survey supplement.pdf (640kb)

03-George-OGC Support for Geosciences.pptx (6.5mb)

04-Ilya-Interop_Readiness.pptx (2.1mb)

05-Gil-EarthCubeWorkflows.pptx (2.3mb)

06-Philip-Discussion on Decision Support.pptx (6.7mb)

07-Krzysztof-Semantics CG.pdf (1.4mb)

08-Reagan-interoperability mechanisms.pptx (320kb)

09-Lesley-Australia's EarthCube.ppt (28mb; too big for easy upload, see the dropbox folder linked above)

10-WhyEarthCubeMustBroker-sjsk20130110.pdf (113kb)

11-EarthCube Community Newsletter

Thanks to all who helped make this work, from OGC, Esri, NSF, and the participant organizations!


Science Magazine Policy Forum: Standards and Infrastructure for Data Exchange

posted Jan 23, 2013, 1:06 PM by David Arctur

(Originally posted on EarthCube blog November 18, 2012 at 7:15pm)

Those interested in the shape and governance of EarthCube should read this article in Science, vol.338, 12-Oct-2012, pages 196-197, available here (subscription required): http://www.sciencemag.org/content/338/6104/196.full.pdf

Some excerpts and comments:

"Data on the global R&D enterprise are inconsistently structured and shared, which hinders understanding and policy.... Data exchange standards are a first step. We describe administrative and technical demands and opportunities to meet them."

The article goes on to describe the requirements for a distributed data infrastructure: supporting open source & proprietary data providers, confidentiality, security and licensing, huge data sizes, private workspace "sandboxes", data versioning, and minimizing top-down control and costs. The article addresses only non-geoscience research data, but the defining issues and parameters are so similar to EarthCube's that it bears consideration.

"Major data providers, including federal statistical agencies, standards organizations, and private vendors, as well as user communities, should establish a steering committee. The US National Academies Board on Research Data and Information, and the Committee on Data for Science and Technology of the International Council for Science [ICSU CODATA] are among the natural sponsors."

It mentions the Common European Research Information Format (CERIF) data exchange standard as one example of a successful model. And: "The Brazilian Lattes Platform provides an integrated system to manage research information... partnering with CASRAI (Consortia Advancing Standards in Research Administration Information), VIVO (an open research-networking community group), and EuroCRIS (Current Research Information Systems) to identify a shared exchange standard."

Also mentioned:

- Web of Science citation data;

- KNODE's work to link papers and patents;

- vendors working with universities to collect and manage research data;

- user incentives, such as the US STAR METRICS initiative;

- standards and interoperability across research reporting systems, for improved attribution of researchers' work;

- development of citation standards across disciplines and data types, from data sets to algorithms to organisms;

- metrics based on such standards that cover nonpublication research outputs, to further incentivize researchers' data sharing.

"The new NSF National Center for Science and Engineering Statistics (NCSES) secure data access facility provides a model of managed access to restricted data, balancing security and access for cleared researchers. Such models can work for commercial providers, such as the tiered security MarketScan repository, which contains medical claims data and is used to support analysis of health-care cost and treatment and patient behavior."

Finally:

"Researchers lament the lack of data sharing (6). By linking data and algorithms to the infrastructure, researchers could—with permissions—access other research projects, encouraging replication and resource utilization. Users would register for access to security-sensitive parts of the infrastructure, use public data and tools free of charge, and pay for access to private work spaces, intellectual property–controlled data sets, or customized analytic tools. Users could post comments on components. Providers would document their data with standard metadata, including data elements, sample frame, access levels, terms of use, and any fees, which could vary according to the amount and nature of use (e.g., scholarly, commercial, or algorithm development), and vary across providers (e.g., academics and government agencies might set prices to cover expenses, with commercial providers setting higher prices for different functionality and tools). Payment could be managed much the way e-readers manage access to applications and content. Such a structure minimizes centralized support and subsidies; allows data to be maintained by providers who can manage access, data updates, and algorithms for data processing; and users can distribute their own tools and algorithms. These objectives are in line with the U.S. government memorandum on data sharing and privacy (7). The proposed model offers potential benefits from combining and mining the vast data already available. The first step is to coordinate existing data exchange efforts, the foundation on which the entire effort relies."

Have they been following EarthCube? It doesn't look like it from their references, but it sure sounds like it from the issues and direction.



TED Talk by Clay Shirky, and how open-source tools could enable EarthCube

posted Jan 23, 2013, 1:02 PM by David Arctur

(Originally posted on EarthCube blog, October 8, 2012 at 9:45pm)

The title (and link) for the TED talk is Clay Shirky: How the Internet will (one day) transform government, about open-source tools and "cooperation without coordination". This talk really addresses some of the core complexities in EarthCube's mission to democratize knowledge. How many ways could we use GitHub to enable EarthCube?

Geosciences in OGC: 2010 Update

posted Jul 17, 2012, 2:13 AM by David Arctur   [ updated Jul 17, 2012, 2:14 AM ]

Posted in OGC News, December 2010, as This Month's Staff Message -- some more of the history of geosciences' emergence in OGC. The original post unintentionally omitted the very significant contributions of Australia's CSIRO researchers toward the formation of the Hydro DWG and WaterML.

Whenever a large office building is being constructed, it seems like a long time goes by at first, with very little visible progress. The foundation and infrastructure are often below ground level and behind a fence, making it even harder to see progress. But once the walls start to go up, every day brings dramatic changes.

We seem to have reached the "wall building" stage now, regarding OGC standards in the geosciences. It was just 2 years ago, after 16 years of work within OGC on the core standards for mapping and sensor observations, that we first talked with the World Meteorological Organization (WMO) about letting us help shape their future standards. At the ‘First workshop on use of GIS/OGC standards in meteorology’, OGC and WMO both realized that many members of the meteorology community were using the OGC WMS, WCS, and WFS standards in interesting ways: differently than we expected, not all in the same way, and handling concepts that were new for us, such as forecast time and the mapping of weather variables like temperature and pressure. The workshop, organized by the UK Met Office, the European Centre for Medium-Range Weather Forecasts (ECMWF), and Météo France, was a first attempt to determine best practices for using and extending OGC standards in European meteorology.

Similarly, in the hydrology field, the US-based Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) had been working for 8 years to evolve a service-oriented architecture (SOA) for cataloguing time-series observations from stream gauges across the US maintained by USGS, EPA, NOAA's National Weather Service, and various state agencies. As this system was coming online and being adopted within USGS and NWS, CUAHSI realized that OGC could help ensure global application of the architecture. They started working with us to adapt their initial customized SOA to OGC standards, which turned out to be a good fit. But they realized the need to draw in more hydrology domain scientists, so CUAHSI sent OGC to the WMO Commission for Hydrology (CHy) meeting in Geneva, just a couple of weeks before the first meteorology workshop, to see what we could develop.

From the serendipity of these two meetings being co-located with meetings of the WMO's key domains grew a commitment between OGC and WMO, formalized a year later, to support each other's standards program through designated experts chairing the relevant working groups in both organizations. This goes beyond meteorology and hydrology to include climate, oceanography, and atmospheric science commissions within WMO. This ensures that WMO retains intellectual control of the domain sciences of concern to them within OGC, while taking advantage of OGC's strengths in convening interdisciplinary collaborative meetings four times a year, around the world.
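For a concrete taste of the standards-based access that grew out of this hydrology work, here is a minimal sketch that requests recent streamflow as WaterML 2.0 from the public USGS water services API; the site number, period, and parameter code are illustrative choices, not the only options.

# Minimal sketch: fetch recent streamflow observations as WaterML 2.0
# from the USGS instantaneous-values service. Parameter values are
# illustrative.
import requests

params = {
    "format": "waterml,2.0",  # request WaterML 2.0, an OGC standard
    "sites": "08158000",      # USGS gauge: Colorado River at Austin, TX
    "parameterCd": "00060",   # 00060 = discharge (cubic feet per second)
    "period": "P1D",          # last 24 hours (ISO 8601 duration)
}
resp = requests.get("https://waterservices.usgs.gov/nwis/iv/", params=params)
resp.raise_for_status()

# The response is standards-conformant XML that any WaterML-aware
# client can parse, with no gauge-network-specific code required.
print(resp.text[:500])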

Steven Ramage and I recently attended the ‘Third workshop on use of GIS/OGC standards in meteorology' hosted by the UK Met Office at their facilities in Exeter, co-sponsored again by ECMWF and Météo France. The meeting was attended by experts representing national agencies in weather, climate, aviation, and defence from Europe and now North America. This year, OGC members on the European INSPIRE Thematic Working Group on Atmospheric Conditions and Meteorological Geographic Features are planning to implement the INSPIRE requirements for a limited part of the complete information model, through an OGC Interoperability Experiment.   

Still another thread of geosciences outreach for OGC is our participation in the annual conferences of the American Geophysical Union (AGU) in San Francisco each fall, and the European Geosciences Union (EGU) in Vienna each spring. We have chaired sessions at these conferences since 2008, within a newly created domain section of both organizations called Earth and Space Science Informatics (ESSI). At the AGU meeting just held, we saw that OGC standards and practices are being included in the core architectures of important new NSF-funded observation networks and archives, such as DataONE.org and IEDAData.org. We also see new areas we need to work on, such as provenance automation and integrated modeling frameworks. As an indication of growing recognition of the role we can play in fostering interdisciplinary collaborations of domain scientists and cyberinfrastructure developers, we have been asked to be a sponsor at ESSI's future meetings and to contribute to its publications. The future for geosciences in the OGC is bright!

On behalf of everyone on the OGC staff, I thank you for your support in 2010 and wish you all a very happy and successful 2011!

--David Arctur
Director, Interoperability Programs
