geoknife package

Jordan Read

20 July, 2022

##Introduction

The geoknife package was created to support web-based geoprocessing of large gridded datasets according to their overlap with landscape (or aquatic/ocean) features that are often irregularly shaped. geoknife creates data access and subsequent geoprocessing requests for the USGS’s Geo Data Portal to carry out on a web server. The results of these requests are available for download after the processes have been completed. This type of workflow has three main advantages: 1) it allows the user to avoid downloading large datasets, 2) it avoids reinventing the wheel for the creation and optimization of complex geoprocessing algorithms, and 3) computing resources are dedicated elsewhere, so geoknife operations do not have much of an impact on a local computer.

Because communication with web resources is central to geoknife operations, users must have an active internet connection. geoknife interacts with a remote server to discover processing capabilities, find already-available geospatial areas of interest (these are normally user-uploaded shapefiles), get gridded dataset characteristics, execute geoprocessing requests, and get geoprocessing results.

The main elements of setting up and carrying out a geoknife ‘job’ (a geojob object) include defining the feature of interest ‘stencil’ (a webgeom or simplegeom object), the gridded web dataset ‘fabric’ to be processed (a webdata object), and the processing algorithm ‘knife’ parameters (a webprocess object). The status of the geojob can be checked with check, and output can be loaded into a data.frame with result. See below for more details.

##Installation

To install the stable version of the geoknife package with dependencies:

install.packages("geoknife", 
    repos = c("https://owi.usgs.gov/R","https://cran.rstudio.com/"),
    dependencies = TRUE)

Or to install the current development version of the package:

install.packages("devtools")
devtools::install_github('USGS-R/geoknife')

getting started

The geoknife package was created to support web-based geoprocessing of large gridded datasets according to their overlap with landscape (or aquatic/ocean) features that are often irregularly shaped. geoknife creates data access and subsequent geoprocessing requests for the USGS’s Geo Data Portal to carry out on a web server.

geoknife concepts

geoknife has abstractions for web-available gridded data, geospatial features, and geoprocessing details. These abstractions are the basic geoknife arguments of fabric, stencil and knife.
* fabric defines the web data that will be accessed, subset, and processed (see the fabric section for more details). These data are limited to gridded datasets that are web-accessible through the definitions presented in the OPeNDAP section. Metadata for fabric include time, the URL for the data, and variables.
* stencil is the geospatial feature (or set of features) that will be used to delineate specific regions of interest on the fabric (see the stencil section for more details). stencil can include point or polygon groupings of various forms (including classes from the sp R package).
* knife defines the way the analysis will be performed, including the algorithm and version used, the URL that receives the processing request, the statistics returned, and the format of the results (see the knife section for more details).
* The geoknife() function takes the fabric, stencil, and knife, and returns a geojob, which is a live geoprocessing request that will be carried out on a remote web server (see the geojob section for more details). The geojob can be checked by users, and results can be parsed and loaded into the R environment for analyses.

remote processing basics

Because geoknife executes geospatial computations on a remote webserver, the workflow for executing geoprocessing operations may feel a bit foreign to users who usually perform their analyses on a local computer. To find available datasets and their details (variables, time range, etc.), geoknife must query remote servers, because data for use with geoknife is typically hosted on open access servers near the processing service. These operations are covered in detail below, but this section is designed to provide a quick overview.

Interactions with web resources may take on the following forms, and each involves separate requests to various webservers (a minimal sketch of the full sequence follows this list):

  1. Using the query function to figure out what data exist for fabric. This function will request data from a CSW (catalog service for the web) resource and return results, or, if a dataset is already specified, it can be used to query for the variables or time dimension.
  2. Using the query function to find web-available geometries for stencil, including US States, Level III Ecoregions, and many others.
  3. Submitting a geojob to be processed externally
  4. Checking the status of a geojob
  5. Loading the results from a successful geojob
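
The sketch below strings these five interactions together, assuming the default Geo Data Portal endpoints; each function is described in detail in the sections that follow:

# 1. discover datasets indexed by the Geo Data Portal catalog and inspect one
datasets <- query('webdata')
fabric <- webdata(datasets[99])
variables(fabric) <- query(fabric, 'variables')[1]

# 2. point a stencil at a web-available geometry (here, a US State)
stencil <- webgeom('state::Wisconsin')

# 3. submit the geoprocessing job
job <- geoknife(stencil, fabric)

# 4. check on the job's status
check(job)

# 5. load the results once the job completes successfully
data <- result(job)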

quick start guide

There are various ways to get up and running quickly with geoknife. See sections below for additional details on any of the following operations. As mentioned above, geoknife has the basic arguments of fabric, stencil and knife. knife is an optional argument, and if not used, a default knife will be used to specify the processing details.

define a stencil that represents the geographic region to slice out of the data

There are many different ways to specify geometry (stencil) for geoknife. The two basic functions that support building stencil objects are simplegeom and webgeom:

library(geoknife)

Use a single longitude latitude pair as the geometry with the simplegeom function:

stencil <- simplegeom(c(-89, 46.23))

Or specify a collection of named points in a data.frame (note that naming is important for multi-features because it specifies how the results are filtered):

stencil <- simplegeom(data.frame(
              'point1' = c(-89, 46), 
              'point2' = c(-88.6, 45.2)))
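
The feature names matter downstream: each named point becomes its own column in the result data.frame, which is how results are filtered by feature. A minimal sketch, assuming the PRISM quick start dataset described below and a completed job:

job <- geoknife(stencil, webdata('prism'), wait = TRUE)
df <- result(job)
head(df[, c('DateTime', 'point1')])  # values summarized for the 'point1' feature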

Use a web-available geometry dataset with the webgeom function to specify state boundaries:

stencil <- webgeom('state::New Hampshire')
stencil <- webgeom('state::New Hampshire,Wisconsin,Alabama')

or HUC8s (hydrologic unit codes):

stencil <- webgeom('HUC8::09020306,14060009')

Display stencil:

stencil
## An object of class "webgeom":
## url: http://cida.usgs.gov/gdp/geoserver/wfs 
## geom: derivative:wbdhu8_alb_simp 
## attribute: HUC_8 
## values: 09020306, 14060009 
## wfs version: 1.1.0

See what other HUCs could be used via the query function:

HUCs <- query(stencil, 'values')

There are thousands of results, but head() will only display a few of them:

head(HUCs) 
## [1] "11060006" "11060005" "11060001" "11060004" "11060003"
## [6] "11060002"

define a fabric that represents the underlying data

The Geo Data Portal’s web data catalog is quite extensive and includes many datasets that can all be processed with geoknife. Check it out at cida.usgs.gov/gdp; note that it is not a complete list of all relevant datasets that can be accessed and processed. The geoknife package also has a number of quick access datasets built in (similar to the quick start webgeom objects).

An example of a quick start dataset:

fabric <- webdata('prism')
fabric
## An object of class "webdata":
## times: 1895-01-01T00:00:00Z, 1899-01-01T00:00:00Z
## url: https://cida.usgs.gov/thredds/dodsC/prism_v2 
## variables: ppt

which can be a starting point for the PRISM dataset, as the fields can be modified:

times(fabric) <- c('2002-01-01','2010-01-01')
variables(fabric) <- c('tmx')
fabric
## An object of class "webdata":
## times: 2002-01-01T00:00:00Z, 2010-01-01T00:00:00Z
## url: https://cida.usgs.gov/thredds/dodsC/prism_v2 
## variables: tmx

create the processing job that will carry out the subsetting/summarization task

job <- geoknife(stencil, fabric)

Use convenience functions to check on the job:

check(job)
running(job)
error(job)
successful(job)

Cancel a running job:

job <- cancel(job)

Run the job again, but have R wait until the process is finished:

job <- geoknife(stencil, fabric, wait = TRUE)

Load up the output and plot it:

data <- result(job)
plot(data[,1:2], ylab = variables(fabric))

For long running processes, it often makes sense to use an email listener:

job <- geoknife(webgeom('state::Wisconsin'), fabric = 'prism', email = 'fake.email@gmail.com')

spatial features (stencil)

The stencil concept in geoknife represents the area(s) of interest for geoprocessing. stencil can be represented by two classes in geoknife: simplegeom and webgeom. Other classes can also be used if they can be coerced into either of these two classes (such as data.frame).

simplegeom object

The simplegeom class is designed to hold spatial information from the R environment and make it available to the processing engine. simplegeom is effectively a wrapper for the sp package’s SpatialPolygons class, but it also coerces a number of other types into this class. For example:

Points can be specified as longitude latitude pairs:

stencil <- simplegeom(c(-89, 45.43))

or as a data.frame:

stencil <- simplegeom(data.frame(
              'point1' = c(-89, 46), 
              'point2' = c(-88.6, 45.2)))

A SpatialPolygons object can also be used (example from the sp package):

library(sp)
## Warning: package 'sp' was built under R version 4.2.1
Sr1 = Polygon(cbind(c(2,4,4,1,2),c(2,3,5,4,2)))
Sr2 = Polygon(cbind(c(5,4,2,5),c(2,3,2,2)))
Sr3 = Polygon(cbind(c(4,4,5,10,4),c(5,3,2,5,5)))
Sr4 = Polygon(cbind(c(5,6,6,5,5),c(4,4,3,3,4)), hole = TRUE)

Srs1 = Polygons(list(Sr1), "s1")
Srs2 = Polygons(list(Sr2), "s2")
Srs3 = Polygons(list(Sr3, Sr4), "s3/4")
stencil <- simplegeom(Srl = list(Srs1,Srs2,Srs3), proj4string = CRS("+proj=longlat +datum=WGS84"))
## Warning in value[[3L]](cond): SpatialPolygons support is
## deprecated.

webgeom object

The webgeom class is designed to hold references to web feature service (WFS) details and make it available to the processing engine.

Similar to webdata (see below), the webgeom class has public fields that can be set and accessed using simple methods. The public fields in webgeom are url, geom, attribute, values, and the WFS version.

To create a default webgeom object:

stencil <- webgeom()

The user-level information in webgeom is all available with the webgeom “show” method (or print).

stencil
## An object of class "webgeom":
## url: https://cida.usgs.gov/gdp/geoserver/wfs 
## geom: NA 
## attribute: NA 
## values: NA 
## wfs version: 1.1.0

The public fields can be accessed and set by using the field name:

geom(stencil) <- "sample:CONUS_states"
attribute(stencil) <- "STATE"
values(stencil) <- c("Wisconsin","Maine")

quick access to web available data for webgeoms

There are some built-in webgeom templates that can be used to figure out the pattern, or to use these datasets for analysis. Currently, the package only supports US States, Level III Ecoregions, and HUC8s:

stencil <- webgeom('state::Wisconsin')
webgeom('state::Wisconsin,Maine')
webgeom('HUC8::09020306,14060009')
webgeom('ecoregion::Colorado Plateaus,Driftless Area')

query function for webgeom

The query function on webgeom can be used to find possible inputs for each public field (other than version and url currently):

query(stencil, 'geoms')
##  [1] "sample:Alaska"                  
##  [2] "upload:CIDA_TEST_"              
##  [3] "sample:CONUS_Climate_Divisions" 
##  [4] "sample:CONUS_states"        
##  [5] "sample:CONUS_states"            
##  [6] "sample:CSC_Boundaries"          
##  [7] "sample:Landscape_Conservation_Cooperatives"             
##  [8] "sample:FWS_LCC"                 
##  [9] "sample:simplified_huc8"     
## [10] "sample:Ecoregions_Level_III"
## [12] "sample:Counties"         
## [13] "sample:nps_boundary_2013"       
## [14] "upload:nrhu_selection"          
## [15] "upload:nrhu_selection_Gallatin" 
## [16] "sample:simplified_HUC8s"        
## [17] "draw:test"
query(stencil, 'attributes')
## [1] "STATE"

gridded data (fabric)

The fabric concept in geoknife represents the gridded dataset that will be operated on by the tool. fabric can be a time-varying dataset (such as PRISM) or a spatial snapshot coverage dataset (such as the NLCD). At present, fabric is limited to datasets that can be accessed using the OPeNDAP protocol or WMS (web map service). Most helper functions in geoknife, including query(fabric, 'variables'), tend to work better for OPeNDAP datasets.

webdata object

The webdata class holds all the important information for web datasets in order to make them available for processing by geoknife’s outsourced geoprocessing engine, the Geo Data Portal. The public fields in webdata include times, url, and variables.

To create a default webdata object:

fabric <- webdata()

The user-level information in webdata is all available with the webdata “show” method (or print).

fabric
## An object of class "webdata":
## times: NA, NA
## url: NA 
## variables: NA

The public fields can be accessed and set by using the field name:

times(fabric)
## [1] NA NA
url(fabric) <- 'https://cida.usgs.gov/thredds/dodsC/prism'
variables(fabric) <- 'tmx'

times(fabric)[1] <- as.POSIXct('1990-01-01')

find data that is indexed by the Geo Data Portal catalog

The fabric is specified using the webdata function. geoknife can access a catalog of webdata by using the query function:

webdatasets = query('webdata')
length(webdatasets)
## [1] 190

Interrogating datasets can be done by printing the returned dataset list, which displays the title and the url of each dataset by default (this example truncates the 190 datasets to display 5):

webdatasets[61:65]
## An object of class "datagroup":
## [1] Eighth degree-CONUS Daily Downscaled Climate Projections Minimum and Maximum Temperature 
##   url: http://cida.usgs.gov/thredds/dodsC/dcp/conus_t 
## [2] Eighth degree-CONUS Daily Downscaled Climate Projections Precipitation 
##   url: http://cida.usgs.gov/thredds/dodsC/dcp/conus_pr 
## [3] Future California Basin Characterization Model Downscaled Climate and Hydrology 
##   url: http://cida.usgs.gov/thredds/dodsC/CA-BCM-2014/future 
## [4] GLDAS Version 2.0 Noah 0.25 degree monthly data 
##   url: http://hydro1.sci.gsfc.nasa.gov/dods/GLDAS_NOAH025_M.020 
## [5] GLDAS Version 2.0 Noah 1.0 degree 3-hourly data 
##   url: http://hydro1.sci.gsfc.nasa.gov/dods/GLDAS_NOAH10_3H.020

Finding additional information about a particular dataset is supported by title() and abstract(), which return the dataset titles and abstracts respectively:

title(webdatasets[87])
## [1] "North Central River Forecasting Center - Quantitative Precipitation Estimate Archive"
abstract(webdatasets[87])
## [1] "Radar indicated-rain gage verified and corrected hourly precipitation estimate on a corrected ~4km HRAP grid."

Indexing datasets by order or by title is equivalent:

fabric <- webdata(webdatasets[99])
evapotran <- webdata(webdatasets['Monthly Conterminous U.S. actual evapotranspiration data'])

To modify the times in fabric, use times():

times(fabric) <- c('1990-01-01','2005-01-01')

Similar to webgeom, the query method can be used on webdata objects:

query(fabric, 'times')
query(fabric, 'variables')

find data that is not indexed by the Geo Data Portal catalog

There are hundreds (or potentially thousands) of additional OPeNDAP datasets that will work with geoknife, but need to be found through web searches or catalogs (e.g., www.esrl.noaa.gov/psd/thredds/dodsC/Datasets, apdrc.soest.hawaii.edu/data/data.php). One such example is Sea Surface Temperature from the Advanced Very High Resolution Radiometer (AVHRR) temperature sensing system. Specifying datasets such as this requires finding out the OPeNDAP endpoint (URL) for the dataset, and specifying it as the url to webdata (we found this example in the extensive apdrc.soest.hawaii.edu/data/data.php catalog):

fabric = webdata(url='dods://apdrc.soest.hawaii.edu/dods/public_data/satellite_product/AVHRR/avhrr_mon')

A query for variables doesn’t work for this dataset, because its variables don’t have units and therefore “valid” variables are not returned (instead you get an empty list). From the OPeNDAP endpoint, it is clear that this dataset has one variable of interest, which is called ‘sst’:

variables(fabric) <- 'sst'
query(fabric, 'times')
[1] "1985-01-01 UTC" "2003-05-01 UTC"
times(fabric) <- c('1990-01-01','1999-12-31')

Plotting the July surface temperature of a spot on the Caspian Sea is done by:

sst = result(geoknife(data.frame('caspian.sea'=c(51,40)), fabric, wait = TRUE))
head(sst)
##     DateTime caspian.sea variable statistic
## 1 1990-01-01      11.250      sst      MEAN
## 2 1990-02-01      10.575      sst      MEAN
## 3 1990-03-01      10.350      sst      MEAN
## 4 1990-04-01      11.400      sst      MEAN
## 5 1990-05-01      14.925      sst      MEAN
## 6 1990-06-01      19.800      sst      MEAN
july.idx <- months(sst$DateTime) == 'July'
plot(sst$DateTime[july.idx], sst$caspian.sea[july.idx], type='l', lwd=2, col='dodgerblue', ylab='Sea Surface Temperature (degC)', xlab=NA)

query function for webdata

The query function works on webdata, similar to how it works for webgeom objects. For the PRISM dataset specified above, the time range can come from query with 'times':

fabric = webdata('prism')
variables(fabric) <- 'ppt'
query(fabric, 'times')
## [1] "1895-01-01 UTC" "2013-02-01 UTC"

Likewise, variables with 'variables':

query(fabric, 'variables')
## [1] "ppt" "tmx" "tmn"

Note that a variable has to be specified to use the times query:

variables(fabric) <- NA

This will fail:

query(fabric, 'times')
## Error in times_query(fabric, knife) : 
##   variables cannot be NA for fabric argument

At present, the geoknife package does not have a query method for dataset urls.

processing details (knife)

The webprocess class holds all the important processing details for geoknife’s outsourced geoprocessing engine, the Geo Data Portal. The public fields in webprocess include url, algorithm, version, processInputs, wait, and email.

query function for webprocess

The query function works on webprocess, similar to how it works for webgeom and webdata objects. For a default webprocess object, the available algorithms can be queried by:

knife <- webprocess()
query(knife, 'algorithms')
## $`Categorical Coverage Fraction`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm"
## 
## $`OPeNDAP Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm"
## 
## $`Area Grid Statistics (unweighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm"
## 
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"

Changing the webprocess url will modify the endpoint for the query, and different algorithms may be available:

url(knife) <- 'https://cida.usgs.gov/gdp/process/WebProcessingService'
query(knife, 'algorithms')
## $`Categorical Coverage Fraction`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCategoricalGridCoverageAlgorithm"
## 
## $`OPeNDAP Subset`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureCoverageOPeNDAPIntersectionAlgorithm"
## 
## $`Area Grid Statistics (unweighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm"
## 
## $`Area Grid Statistics (weighted)`
## [1] "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm"

algorithm

As noted above, the algorithm field in webprocess is a list, specifying the algorithm name and relative path to the algorithm endpoint. To access or change the algorithm:

knife <- webprocess()
algorithm(knife) <- query(knife, 'algorithms')[1]
# -- or --
algorithm(knife) <- list('Area Grid Statistics (weighted)' = 
                           "gov.usgs.cida.gdp.wps.algorithm.FeatureWeightedGridStatisticsAlgorithm")

inputs

Getting and setting processInputs for geoknife is currently in development. Check back later.

url

The url field in webprocess can be accessed and set as expected:

url(knife) <- 'https://cida.usgs.gov/gdp/process/WebProcessingService'

wait

The wait boolean in webprocess can be set during creation:

knife <- webprocess(wait = TRUE)
knife
## An object of class "webprocess":
## url: https://cida.usgs.gov/gdp/process/WebProcessingService 
## algorithm: Area Grid Statistics (weighted) 
## web processing service version: 1.0.0 
## process inputs: 
##    SUMMARIZE_FEATURE_ATTRIBUTE: false
##    SUMMARIZE_TIMESTEP: false
##    REQUIRE_FULL_COVERAGE: true
##    DELIMITER: COMMA
##    STATISTICS: 
##    GROUP_BY: 
## wait: TRUE 
## email: NA

email

The email field in webprocess can be accessed and set as expected:

knife <- webprocess(email = 'fake.email@gmail.com')
knife
## An object of class "webprocess":
## url: https://cida.usgs.gov/gdp/process/WebProcessingService 
## algorithm: Area Grid Statistics (weighted) 
## web processing service version: 1.0.0 
## process inputs: 
##    SUMMARIZE_FEATURE_ATTRIBUTE: false
##    SUMMARIZE_TIMESTEP: false
##    REQUIRE_FULL_COVERAGE: true
##    DELIMITER: COMMA
##    STATISTICS: 
##    GROUP_BY: 
## wait: FALSE 
## email: fake.email@gmail.com

geojob details

The geojob in the geoknife package contains all of the processing configuration details required to execute a processing request to the Geo Data Portal and check up on the state of that request. A geojob object is created using the high-level function geoknife() with the stencil, fabric and optional knife arguments as described above.
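
For example, a knife with a non-default algorithm (taken from the algorithm list shown in the previous section) can be built and passed as the third argument. A sketch, reusing the stencil and fabric from the quick start guide:

knife <- webprocess()
algorithm(knife) <- list('Area Grid Statistics (unweighted)' = 
                           "gov.usgs.cida.gdp.wps.algorithm.FeatureGridStatisticsAlgorithm")
job <- geoknife(stencil, fabric, knife)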

geojob class and details

The geojob public fields hold the processing request and the information needed to track it on the remote server; the most commonly used accessor is id(), which returns the job identifier (see below).
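
A typical pattern for inspecting a geojob, reusing functions introduced in the quick start guide (a sketch):

check(job)       # current status of the remote process
successful(job)  # TRUE once processing has completed without error
id(job)          # identifier/URL used to track the job and retrieve results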

cancel geojob

The geoknife package currently limits users to a single running process at a time, to avoid creating thousands of requests in error, which could overwhelm the processing resources. If there is a reason to support additional simultaneous jobs, please email the package maintainers with your query.

To cancel an existing job while retaining the job details:

id(job)
## [1] "https://cida.usgs.gov:80/gdp/process/RetrieveResultServlet?id=a264a88c-9672-4029-915b-a09b1403d26a"
job <- cancel(job)
id(job)
## [1] "<no active job>"

To cancel any running job without specifying the geojob reference:

cancel()

geoknife web resources

geoknife outsources all major geospatial processing tasks to a remote server. Because of this, users must have an active internet connection. Problems with connections to datasets or the processing resources are rare, but they do happen. When experiencing a connectivity problem, the best approach is often to try again later or to email the package maintainers with any questions. The various web dependencies are described below.
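
Because these calls depend on remote services, a defensive pattern (plain base R, not a geoknife feature) is to wrap them in tryCatch so a temporary outage does not halt a longer script; a sketch using the catalog query shown earlier:

webdatasets <- tryCatch(
  query('webdata'),
  error = function(e) {
    message('Geo Data Portal catalog request failed: ', conditionMessage(e))
    NULL
  })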

Geo Data Portal

The U.S. Geological Survey’s “Geo Data Portal” (GDP) provides the data access and processing services that are leveraged by the geoknife package. See cida.usgs.gov/gdp for the GDP user interface.