
Data Consumers

Use Cases

This page contains use cases from data consumers, i.e. portal users.


The following use cases are based on feedback from users of the AODN open geospatial portal, version 2. Version 3 of the AODN portal (otherwise known as 123) was designed to address some of these use cases. The original email correspondence is included below.


Download Temperature and Velocity Data from NSW Moorings

The user would like to download temperature and velocity data from NSW moorings without having to fetch large numbers of NetCDF files one at a time, and without needing many clicks.


(Based on feedback from Robin Robertson)


[Why: downloading files individually from a THREDDS catalog is impractical]
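As a rough illustration of the workaround users currently script for themselves, the sketch below walks a THREDDS catalog listing and collects direct-download URLs in one pass instead of one click per file. The catalog XML, server name, and file paths here are made-up placeholders, not the actual AODN catalog.

```python
# Sketch: collect direct-download URLs from a THREDDS catalog in one pass.
# A real catalog would be fetched with urllib.request.urlopen(
# "<server>/thredds/catalog/.../catalog.xml"); this minimal fragment stands in.
import xml.etree.ElementTree as ET

THREDDS_NS = "{http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0}"

# Hypothetical catalog fragment standing in for a real server response.
SAMPLE_CATALOG = """<?xml version="1.0"?>
<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <dataset name="NSW moorings">
    <dataset name="file1" urlPath="IMOS/ANMN/NSW/file1.nc"/>
    <dataset name="file2" urlPath="IMOS/ANMN/NSW/file2.nc"/>
  </dataset>
</catalog>"""

def download_urls(catalog_xml, server="http://example.org"):
    """Return HTTP file-server URLs for every dataset that has a urlPath."""
    root = ET.fromstring(catalog_xml)
    urls = []
    for ds in root.iter(THREDDS_NS + "dataset"):
        path = ds.get("urlPath")
        if path:  # container datasets have no urlPath attribute
            urls.append(server + "/thredds/fileServer/" + path)
    return urls

urls = download_urls(SAMPLE_CATALOG)
for u in urls:
    print(u)
```

Each URL can then be passed to any bulk downloader, which is effectively what the FTP workaround in the feedback below achieved.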

Download Glider Data

The user would like an easy way to download the calibrated glider data. The user does not want the data delivered manually via Dropbox, or to face the difficulty of downloading NetCDF files in the way they are currently provided.


(Based on feedback from Robin Robertson)


[Why: downloading files individually from a THREDDS catalog is impractical]

Download ANMN Timor South moorings data

The user would like to download the ANMN Timor South moorings data without needing 160 clicks.


(Based on feedback from Rebecca Cowley)


[Why: downloading files individually from a THREDDS catalog is impractical]

Download XBT data

The user would like to download XBT data from the portal: not just the metadata, but the actual data.


(Based on feedback from Rebecca Cowley)


[Why: metadata is no use if the user can't also get the data]

Download NRS data

The user would like to be able to download NRS moorings data.


(Based on feedback from Peter Thompson)


[Why: downloading files individually from a THREDDS catalog is impractical]

Understandable by scientists from other fields

The user would like to be able to understand the portal even though she is from another field (e.g. genomics).


(Based on feedback from Levente Bodrossy)


[Why: version 2 of the AODN portal is impossible to understand without tuition]

Filter moorings data by deployment and instrument type

The user would like to be able to filter moorings data by deployment and instrument type.


(Based on feedback from Craig Steinberg)


[Why: downloading files individually from a THREDDS catalog is impractical]
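The filtering itself is simple once file records carry deployment and instrument fields; the sketch below shows the idea. The field names, deployment codes, and instrument labels are hypothetical, not the portal's actual schema.

```python
# Sketch: filter a mooring file listing by deployment and instrument type,
# so the user only fetches what they asked for. Records are made-up examples.
records = [
    {"file": "vel_1.nc",  "deployment": "TimorSouth-1204", "instrument": "ADCP"},
    {"file": "temp_1.nc", "deployment": "TimorSouth-1204", "instrument": "thermistor"},
    {"file": "vel_2.nc",  "deployment": "TimorSouth-1208", "instrument": "ADCP"},
]

def filter_records(records, deployment=None, instrument=None):
    """Keep records matching every criterion that was given (None = no filter)."""
    out = []
    for r in records:
        if deployment is not None and r["deployment"] != deployment:
            continue
        if instrument is not None and r["instrument"] != instrument:
            continue
        out.append(r)
    return out

adcp = filter_records(records, deployment="TimorSouth-1204", instrument="ADCP")
print([r["file"] for r in adcp])
```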

Download data as CSV

The user would like to download sea surface temperature data from Bass Strait as a CSV file, not NetCDF.


(Based on feedback from Andre Chiaradia)


[Why: the majority of users prefer CSV]
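Serving CSV alongside NetCDF is a thin conversion layer; the sketch below flattens a small gridded SST variable into CSV rows. The coordinate values and SST numbers are invented, and reading the NetCDF itself (e.g. with the netCDF4 or xarray libraries) is assumed to have already happened.

```python
# Sketch: flatten a gridded SST variable into CSV rows (time, lat, lon, sst).
# The nested list stands in for values read from a NetCDF file; all numbers
# are illustrative.
import csv
import io

times = ["2013-01-01", "2013-01-02"]
lats = [-39.5, -40.0]
lons = [145.0, 146.0]
# sst[t][y][x], degrees Celsius (made-up values)
sst = [
    [[16.1, 16.3], [15.8, 15.9]],
    [[16.4, 16.2], [15.7, 16.0]],
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["time", "lat", "lon", "sst_degC"])
for t, tval in enumerate(times):
    for y, lat in enumerate(lats):
        for x, lon in enumerate(lons):
            writer.writerow([tval, lat, lon, sst[t][y][x]])

csv_text = buf.getvalue()
print(csv_text)
```

The result opens directly in a spreadsheet, which is the format preference the feedback describes.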

Download Argo float data from a bounding box in the Southern Ocean

The user would like to download Argo data from a particular region in the Southern Ocean.


(Based on feedback from Esmee)


[Why: downloading files individually from a THREDDS catalog is impractical]
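The bounding-box selection itself is a one-line predicate once profile positions are available in bulk; the sketch below shows just that filtering step. The float IDs, positions, and box coordinates are made up, and boxes crossing the 180° meridian are not handled.

```python
# Sketch: select float profiles inside a lat/lon bounding box in one step,
# rather than clicking through hundreds of per-float records.
profiles = [
    ("5901234", -55.2, 110.4),  # (float_id, lat, lon) - illustrative values
    ("5905678", -48.9, 140.2),
    ("2901111", -60.5, 95.0),
]

def in_box(lat, lon, south, north, west, east):
    """True if (lat, lon) falls inside the box (edges inclusive)."""
    return south <= lat <= north and west <= lon <= east

# Southern Ocean box: 65S-45S, 100E-150E (illustrative)
selected = [fid for fid, lat, lon in profiles
            if in_box(lat, lon, -65.0, -45.0, 100.0, 150.0)]
print(selected)
```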

.


Unedited Feedback


From: Robin Robertson <R.Robertson@adfa.edu.au>

To: Peter Blain <Peter.Blain@utas.edu.au>

Subject: RE: Portal use cases

Date: Mon, 23 Sep 2013 06:04:15 +0000


Hi Peter,



I have used the portal on occasion.  I find it awkward.  More importantly there have been a few times when I have not used it.  These are:


1.      I was trying to download all of the velocity and temperature data from all the NSW moorings.  Even all the data for one of the Sydney moorings was taking too long, clicking on the various things.  It took me about 7 hrs for velocity for 1 mooring.  Now it took me 1-2 months to get permission, but eventually they set up an ftp site.  I was able to download all of the data in a few minutes of my time, although it took the computer a bit longer.  So for updating the data set, downloading one or two files by clicking is OK, but when you have to access big portions of it, the portal is too awkward and time consuming


2.      Glider data : they generally give me this through the drop box.  I also download the kmz, but the calibrated data for analysis comes through dropbox. Maybe you want to change this or make it easier to download the netcdf files.


Cheers

Robin

.


From: Peter.A.Thompson@csiro.au [Peter.A.Thompson@csiro.au]

Sent: Monday, 5 November 2012 12:34

To: Marty Hidas

Subject: RE: [General enquiries] data


Thanks Marty, I did ask Val and she pointed me to some files but I cannot locate these files using the portal.


I am hoping you can make this work better.  For example, I think I do a good job of representing a novice when I try to get information from your system.  In this case I started from the IMOS home page. I click on the 'map' box near the statement that there are two ways of using the portal to find data. Then I click on the 'National Moorings Network" which gives 3 choices, one labelled NRS biomass abundance (which does not tell me whether this is phytoplankton, fish or salt?)  another labelled 'NRS zooplankton abundance' and a third labelled 'National Reference Station - delayed".  I am very puzzled that the first two data sets are not delayed, and wonder why the user must guess what this last data set might include.  Curiously there are no more layers showing all the other data sets.  MOST FRUSTRATING, there does not seem to be any way to get from the map to any of these data. The NRS systems are not hot linked to the data. The graphic does not display properly but I cannot find a useful link. There is the "Point of truth URL of this metadata record (whatever that means, but no data),  the "show child records [opens IMOS MEST]' (but I do not want meta data) and National Reference Stations page on IMOS web Page (no data).  The   Possibly this is a browser problem, possibly it is me being to obtuse, possibly it is another example of how this system does not actually work for many users.  I would be REALLY grateful if you could explain why I cannot make this work.


Peter Thompson

Program Leader - Marine Biogeochemistry

CSIRO Marine and Atmospheric Research

Hobart, Tasmania

7001, Australia


ph (03) 6232 5298

mobile 0467795168


.

Hi Peter,

I had to download the ANMN Timor South moorings data from the aodn website.

The main irritation for me was:

  • I wanted to download all the data at once, and thought that using the 'add to cart' option would add the actual data to the cart. However, it seems that it only adds the metadata to the cart. I tried this with the XBT data also, and same thing. Just metadata information, not the actual data.

Either the 'Download Data' option should be changed to read 'Dowload Metadata' or the cart option needs to be fixed to download the data.


So, to get the data was a little painful too. I clicked on the mooring location on the map to get the popup box, then used the Opendap link to get to the catalogue. Issues with using the thredds server directly:

  • I had to navigate to each folder (instruments are separated by measurement type),

  • It was unclear if I was in the QC'd or non-QC'd folders. Although, now I've used it a couple of times, it's not too bad. I think it's just the format of the folders and files that confused me.

  • There is no button to click to go back up a directory, I had to use the browsers 'back' button.

  • I had to click on every file I needed (a total of 80), but it doesn't download straight away. I then had to select an access option. It took me a couple of goes to figure out what worked for me. (I'm not familiar with any of the advantages of each of the four options, I don't even know what WCS and WMS stands for).

That was 160 clicks to get the data. I think it was a little too many!


For moorings data, it would be good to download all the files for a mooring deployment, rather than by instrument. The options are probably that the data is put into one Netcdf file or multiple files and delivered in a tarred/zipped format. From a single mooring deployment point of view, the single file might be suitable, but if the user just wants temperature data in a place at a time, then the single file option might be better.


I hope this feedback helps. I'm happy to expand on anything if needed.


Bec


______________________________________________________

Ocean Data Analyst/Scientific Programmer

CSIRO Marine and Atmospheric Research

Castray Esplanade,

Hobart,

Tasmania, Australia, 7000

Phone: +61 3 6232 5446

Fax: +61 3 6232 5123

email: Rebecca.Cowley@csiro.au


.

Hi Katy,

Remembering what you told about little feedback on data availability... I just looked at the data on large fish tagging:

http://imos.aodn.org.au/webportal/


...and tried to figure out what data the blue/red dots meant. I would like to know what exactly is detected there (anything from sharks to seals to tuna?), and what is the time frame for the counts (I found the meaning of the colours/sizes in the Styles tag).

So, I followed the link on the bottom right:

http://imosmest.aodn.org.au/geonetwork/srv/en/metadata.show?uuid=177e0ad2-8691-4290-875f-5fce23f3b274


...and couldn’t find the information on that.

I think the bottom line is, the data (at least these ones) are labelled by the experts for the experts. If you are from another field with interest in this field, it is pretty hard work to build a picture. Don’t get me wrong, I find this is an awesome resource and achievement as it is already, but if IMOS wants to see the data used across disciplines, more work is needed with the non-initiated in mind.

I hope this helps!

Cheers,

LEv

Levente Bodrossy Ph.D.

Environmental Genomics Team - Science Leader

CSIRO Marine & Atmospheric Research

Phone: +61 3-62325456| Fax: +61 3-62325000

lev.bodrossy@csiro.au | www.csiro.au

Address: CSIRO Marine and Atmospheric Research, GPO box 1538, Hobart, Tasmania, Australia 7001


.


From: Craig Steinberg [C.Steinberg@aims.gov.au]

Sent: Tuesday, 4 October 2011 10:36

To: Marty Hidas

Cc: Katherine Tattersall; Sebastien Mancini

Subject: RE: ITF Shelf moorings data


Hi Marty – i’ll look at it by next week when staff have returned from overseas.

Has a better structure for the data (by deployment and instrument type) been introduced yet?

I don’t see the value of being a pop up window

I’d like to see a written list of things to do to discover the data as i find a lot of my time is telling people how to do this.

Craig


.

From: Esmee (working or the Argo facility at CSIRO)


IMOS Ocean portal test searches


I want all Argo float data in a regional box in the Southern Ocean

I click on Argo floats – this populates the map, I then move to the search tab. In keywords, I enter Argo, then click on geo extent and date range – enter my preferred ranges, then hit search, it comes up with 35 pages of metadata explaining the same repetitive information for 341 floats and 1 AATAMS facility CTD satellite relay tagging program?

I really want data from all floats but there seems to be no way to get this in one go?

When I click on the individual float metadata records it takes me through a long list that I check carefully and finally find a link to ‘data available via the IMOS OpenDAP server’, I click on this and get a server error. The full record tab also notes that this float (a Kordi float) has a cited responsible party of Australian Ocean Data Centre Joint Facility which given this float is Korean is confusing!).

I try another float (an Australian one and this time I get directed to the data files on the OPenDAP server successfully) however to get all the floats I want in my box I would have to individually click on all 341 float tabs and download them separately. Needless to say this is not practical and I’m going to the USGDAC where I know I can get all my data in one go.



.


From: Andre Chiaradia [achiaradia@penguins.org.au]

Sent: Friday, 27 April 2012 16:17

To: Marty Hidas

Subject: RE: EAC and penguins


Hi Marty

Thanks for your email. Appreciated! I had no probs downloading the historical data (up to 5/2008). Thanks. But I had no luck with 2008 to today. Here is my feedback:

·        Follow your instructions. Found hard to find what I want under Biogeochem_timeseries. I am looking for sst but all sub titles suggest sub surface data. I downloaded anyway. But I couldn’t open the file. I am sure there is a piece of software that do that. It would be handy if better explaining headings are available with option to download text files (like the historical dataset).

·        Previously I tried the map path. I couldn’t move beyond the drop window. Sometimes it was impossible to see the options. Note that I am using Firefox which may be a problem.

·        I have also tried to export data from the Java option. I have the data plotted ok but when I asked to export, the window froze. I tried it in Explorer with similar (bad) experience.

I am aware that most of these difficulties are due to my handicap (I just a biologist!). But I thought to send this feedback since you have asked! I am still keen on the data but I will resume my search on Monday!

All the best

Andre



From: Andre Chiaradia [mailto:achiaradia@penguins.org.au]

Sent: Thursday, 26 April 2012 4:44 PM

To: Katy Hill

Subject: RE: EAC and penguins

Hi Katy

Thanks again for your help. Ken said it is unlikely EAC would affect waters of the Bass Strait but suggested to proceed exploring!

I tried to find MITS data in the IMOS site without success. Would you mind to give some hints on how to get hold of these data?

Thanks a lot (and hope I am not being a pest!)

Andre

From: Peter.Jansen@csiro.au [Peter.Jansen@csiro.au]

Sent: Thursday, 13 December 2012 17:01

To: Marty Hidas; Shavawn Donoghue; Thomas Trull

Cc: peter.d.wiley@gmail.com

Subject: RE: SOTS Data


Hi Marty,



> The workflow documents describe the process of collecting and processing the data, from planning through to being *available* via the portal. How easy *accessible* it is is a separate issue. We agree that it is a very important one, we're certainly putting effort into improving it and we do need as much feedback as we can get to keep us on track, but I think that's separate from the workflows exercise.


You could take that view, unfortunately things are never that simple. The workflow tries to cover the data flow from end to end, starting from the node plan, and should go right through to the writing of scientific papers. At the moment the workflow shows no feed back paths in the workflow, and no checking stages. At the moment it assumes that the data is delivered perfect and final, which we know not to be the case, just look at the email trail to get Pulse-7 data ready. This also does not allow for later adding of other data products, take wave height as an example.


Often the data is not available, and if its not 'accessible' then I would count that as not 'available'. Take Pulse-8 netcdf data file, thats not available, so yes you are correct somethings are not available and some not 'easily accessible'. If available is your goal for the workflow then lets try and achieve that. If every tool you link to on the data page does not work than I would count that as not available, although you are correct, I can write my own code to decode it and access the data, but to the average user its not available. If IMOS is trying to make the data available to the public, then your restricting yourself to those with software engineering (or similar) degrees is a bit restrictive.


It would be more useful if the workflow describes this in more details, i.e. IMOS to deliver to the facility the required format, and headers specified. If the facility could put the data through some checking tool, so they know it was at least going to be compliant then that would make it much easier, and quicker to deliver the data. Checking the data against the IMOS-CF conventions document is very tedious, especially when dealing with large complex data sets.


At the moment I have to support two data delivery routs, one to Tom directly, and one to eMII, if I could send the netcdf file to IMOS and have Tom pick it up from there as cdv, then you would get the data as soon as it was available. But because the data access is so broken (and has been for a long time) no one is willing/able to get the data from the IMOS portal.


Unfortunately if you build a data repository where the the produces of the data cannot check that the data is being output correctly the system just fills up with rubbish data, which cannot be sorted out later. The more cross checks and end to end testing that can be done the greater the chance of the data in the repository being good, and also actually working, and people using it.


So where do I send feedback about usability?

