Type: Package
Title: Download Geospatial Data Available from Several Federated Data Sources
Version: 4.3.0
Date: 2025-04-12
Description: Download geospatial data available from several federated data sources (mainly sources maintained by the US Federal government). Currently, the package enables extraction from nine datasets: The National Elevation Dataset digital elevation models (https://www.usgs.gov/3d-elevation-program 1 and 1/3 arc-second; USGS); The National Hydrography Dataset (https://www.usgs.gov/national-hydrography/national-hydrography-dataset; USGS); The Soil Survey Geographic (SSURGO) database from the National Cooperative Soil Survey (https://websoilsurvey.sc.egov.usda.gov/; NCSS), which is led by the Natural Resources Conservation Service (NRCS) under the USDA; the Global Historical Climatology Network (https://www.ncei.noaa.gov/products/land-based-station/global-historical-climatology-network-daily; GHCN), coordinated by National Climatic Data Center at NOAA; the Daymet gridded estimates of daily weather parameters for North America, version 4, available from the Oak Ridge National Laboratory's Distributed Active Archive Center (https://daymet.ornl.gov/; DAAC); the International Tree Ring Data Bank; the National Land Cover Database (https://www.mrlc.gov/; NLCD); the Cropland Data Layer from the National Agricultural Statistics Service (https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php; NASS); and the PAD-US dataset of protected area boundaries (https://www.usgs.gov/programs/gap-analysis-project/science/pad-us-data-overview; USGS).
License: MIT + file LICENSE
URL: https://docs.ropensci.org/FedData/, https://github.com/ropensci/FedData
BugReports: https://github.com/ropensci/FedData/issues
SystemRequirements: GDAL (>= 3.1.0)
Depends: R (>= 4.1.0)
Imports: curl, httr, dplyr, tibble, tidyr, stringr, igraph, xml2, lifecycle, lubridate, progress, purrr, readr, terra (≥ 1.0), sf (≥ 1.0), ggplot2, glue, magrittr, jsonlite
Encoding: UTF-8
LazyData: true
NeedsCompilation: no
Repository: CRAN
RoxygenNote: 7.3.2
Suggests: arcgislayers (≥ 0.2.0), knitr, leaflet, mapview, ncdf4, rmapshaper, testthat, usethis
Packaged: 2025-04-12 20:38:45 UTC; kyle.bocinsky
Author: R. Kyle Bocinsky [aut, cre, cph], Dylan Beaudette [ctb], Scott Chamberlain [ctb, rev], Jeffrey Hollister [ctb], Julia Gustavsen [rev]
Maintainer: R. Kyle Bocinsky <bocinsky@gmail.com>
Date/Publication: 2025-04-12 21:00:09 UTC

Pipe operator

Description

See magrittr::%>% for details.

Usage

lhs %>% rhs

Arguments

lhs

A value or the magrittr placeholder.

rhs

A function call using the magrittr semantics.

Value

The result of calling rhs(lhs).
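
A minimal illustration of the pipe (assuming the magrittr package is installed):

```r
library(magrittr)

# lhs %>% rhs is equivalent to rhs(lhs):
# each value is passed as the first argument of the next call
result <- c(1, 4, 9) %>%
  sqrt() %>%
  sum()

result
# 6
```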


Scaffolds the common pattern of selecting a layer and filtering by a geometry from an ArcGIS feature service.

Description

This function uses the arcgislayers package, which has had compatibility issues on several commonly used platforms. It is retained mainly for historical reasons.

Usage

agol_filter(url, layer_name = NULL, geom, simplify = TRUE)

Arguments

url

the url of the remote resource. Must be of length one.

layer_name

the name(s) associated with the layer you want to retrieve. Can be a character vector. If NULL (the default), iterates through all layers.

geom

an object of class bbox, sfc or sfg used to filter query results based on a predicate function.

simplify

when only one layer exists, just return the sf object or data.frame; otherwise return a list of these objects.

Value

An sf object, a data.frame, or a list of these objects if layer_name is NULL or if length(layer_name) > 1. Missing layers are returned as NULL.

Examples

## Not run: 

# Get a single layer
agol_filter(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  layer_name = "WBDHU12",
  geom = FedData::meve
)

# Can be returned as a list
agol_filter(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  layer_name = "WBDHU12",
  geom = FedData::meve,
  simplify = FALSE
)

# Get a list with all layers
agol_filter(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  geom = FedData::meve
)

# Or include a vector of layer names
# Note that missing layers are returned as `NULL` values
agol_filter(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  layer_name = c(
    "NHDPoint",
    "NetworkNHDFlowline",
    "NonNetworkNHDFlowline",
    "NHDLine",
    "NHDArea",
    "NHDWaterbody"
  ),
  geom = FedData::meve
)

## End(Not run)


Scaffolds the common pattern of selecting a layer and filtering by a geometry from an ArcGIS feature service.

Description

This function does not use the arcgislayers package, which has had compatibility issues on several commonly used platforms.

Usage

agol_filter_httr(url, layer_name = NULL, geom, simplify = TRUE)

Arguments

url

the url of the remote resource. Must be of length one.

layer_name

the name(s) associated with the layer you want to retrieve. Can be a character vector. If NULL (the default), iterates through all layers.

geom

an object of class bbox, sfc or sfg used to filter query results based on a predicate function.

simplify

when only one layer exists, just return the sf object or data.frame; otherwise return a list of these objects.

Value

An sf object, a data.frame, or a list of these objects if layer_name is NULL or if length(layer_name) > 1. Missing layers are returned as NULL.

Examples

## Not run: 

# Get a single layer
agol_filter_httr(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  layer_name = "WBDHU12",
  geom = FedData::meve
)

# Can be returned as a list
agol_filter_httr(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  layer_name = "WBDHU12",
  geom = FedData::meve,
  simplify = FALSE
)

# Get a list with all layers
agol_filter_httr(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  geom = FedData::meve
)

# Or include a vector of layer names
# Note that missing layers are returned as `NULL` values
agol_filter_httr(
  url = "https://hydro.nationalmap.gov/arcgis/rest/services/NHDPlus_HR/MapServer/",
  layer_name = c(
    "NHDPoint",
    "NetworkNHDFlowline",
    "NonNetworkNHDFlowline",
    "NHDLine",
    "NHDArea",
    "NHDWaterbody"
  ),
  geom = FedData::meve
)

## End(Not run)


Check whether a web service is unavailable, and stop function if necessary.

Description

Check whether a web service is unavailable, and stop function if necessary.

Usage

check_service(x)

Arguments

x

The path to the web service.

Value

An error if the service is unavailable.


Use curl to download a file.

Description

This function makes it easy to implement timestamping and no-clobber of files.

Usage

download_data(
  url,
  destdir = getwd(),
  timestamping = TRUE,
  nc = FALSE,
  verbose = FALSE,
  progress = FALSE
)

Arguments

url

The location of a file.

destdir

Where the file should be downloaded to.

timestamping

Should only newer files be downloaded?

nc

Should files of the same type not be clobbered?

verbose

Should cURL output be shown?

progress

Should a progress bar be shown with cURL output?

Details

If both timestamping and nc are TRUE, the nc (no-clobber) behavior takes precedence over timestamping.

Value

A character string of the file path to the downloaded file.
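
The interplay between timestamping and nc described in Details can be sketched as a small decision helper (a hypothetical illustration of the documented behavior, not the package's internal implementation):

```r
# Hypothetical sketch of the download decision implied by the Details section:
# if the file exists and nc = TRUE, never re-download (no-clobber wins);
# otherwise, with timestamping = TRUE, re-download only when the remote
# copy is newer than the local one.
should_download <- function(file_exists, nc, timestamping, remote_newer) {
  if (!file_exists) return(TRUE)
  if (nc) return(FALSE)            # nc takes precedence over timestamping
  if (timestamping) return(remote_newer)
  TRUE                             # default: always download
}

should_download(file_exists = TRUE, nc = TRUE, timestamping = TRUE, remote_newer = TRUE)
# FALSE
```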


Download the 1-km DAYMET daily weather dataset for a region as a netcdf.

Description

Data are downloaded in the NetCDF format. download_daymet_thredds returns the path to the downloaded NetCDF file.

Usage

download_daymet_thredds(bbox, element, year, region, tempo)

Arguments

bbox

the bounding box in WGS84 coordinates as a comma-separated character vector "xmin,ymin,xmax,ymax"

element

An element to extract.
The available elements are:
dayl = Duration of the daylight period in seconds per day. This calculation is based on the period of the day during which the sun is above a hypothetical flat horizon.
prcp = Daily total precipitation in millimeters per day, sum of all forms converted to water-equivalent. Precipitation occurrence on any given day may be ascertained.
srad = Incident shortwave radiation flux density in watts per square meter, taken as an average over the daylight period of the day. NOTE: Daily total radiation (MJ/m2/day) can be calculated as follows: ((srad (W/m2) * dayl (s/day)) / 1,000,000)
swe = Snow water equivalent in kilograms per square meter. The amount of water contained within the snowpack.
tmax = Daily maximum 2-meter air temperature in degrees Celsius.
tmin = Daily minimum 2-meter air temperature in degrees Celsius.
vp = Water vapor pressure in pascals. Daily average partial pressure of water vapor.

year

An integer year to extract.

region

The name of a region. The available regions are:
na = North America
hi = Hawaii
pr = Puerto Rico

tempo

The frequency of the data. The available tempos are:
day = Daily data
mon = Monthly summary data
ann = Annual summary data

Value

A named list of character vectors, each representing the full local paths of the tile downloads.
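
The daily total radiation conversion noted in the srad description can be computed directly (example values only):

```r
# Daily total radiation (MJ/m^2/day) from shortwave flux density and day length
srad <- 350    # incident shortwave radiation (W/m^2), example value
dayl <- 43200  # daylight period (s/day), example value (12 hours)

total_radiation <- (srad * dayl) / 1e6
total_radiation
# 15.12
```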


Download the daily data for a GHCN weather station.

Description

Download the daily data for a GHCN weather station.

Usage

download_ghcn_daily_station(ID, raw.dir, force.redo = FALSE)

Arguments

ID

A character string giving the station ID.

raw.dir

A character string indicating where raw downloaded files should be put.

force.redo

If this weather station has been downloaded before, should it be updated? Defaults to FALSE.

Value

A character string representing the full local path of the GHCN station data.


Download the latest version of the ITRDB.

Description

Downloads and parses the latest zipped (numbered) version of the ITRDB. This function includes improvements to the read_crn function from the dplR library. The principal changes are better parsing of metadata and support for the Schweingruber-type Tucson format. Chronologies that cannot be read are reported to the user.

Usage

download_itrdb(
  raw.dir = paste0(tempdir(), "/FedData/raw/itrdb"),
  force.redo = FALSE
)

Arguments

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing. Defaults to a temporary directory.

force.redo

If a download already exists, should a new one be created? Defaults to FALSE.

Value

A data frame containing all of the ITRDB data.


Download a zipped directory containing a shapefile of the SSURGO study areas.

Description

Download a zipped directory containing a shapefile of the SSURGO study areas.

Usage

download_ssurgo_inventory(raw.dir, ...)

Arguments

raw.dir

A character string indicating where raw downloaded files should be put.

Value

A character string representing the full local path of the SSURGO study areas zipped directory.


Download a zipped directory containing the spatial and tabular data for a SSURGO study area.

Description

download_ssurgo_study_area first tries to download data including a state-specific Access template, then the general US template.

Usage

download_ssurgo_study_area(area, date, raw.dir)

Arguments

area

A character string indicating the SSURGO study area to be downloaded.

date

A character string indicating the date of the most recent update to the SSURGO area for these data. This information may be gleaned from the SSURGO Inventory (get_ssurgo_inventory).

raw.dir

A character string indicating where raw downloaded files should be put.

Value

A character string representing the full local path of the SSURGO study areas zipped directory.


Extract data from a SSURGO database pertaining to a set of mapunits.

Description

extract_ssurgo_data creates a directed graph of the joins in a SSURGO tabular dataset, and then iterates through the tables, only retaining data pertinent to a set of mapunits.

Usage

extract_ssurgo_data(tables, mapunits)

Arguments

tables

A list of SSURGO tabular data.

mapunits

A character vector of mapunits (likely dropped from SSURGO spatial data) defining which mapunits to retain.

Value

A list of extracted SSURGO tabular data.
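
A simplified sketch of the mapunit-filtering idea (with hypothetical toy tables and column names; the real function traverses the SSURGO join structure as a directed graph before filtering):

```r
# Toy SSURGO-like tables keyed by mapunit (mukey)
tables <- list(
  mapunit   = data.frame(mukey = c("1", "2", "3"), muname = c("A", "B", "C")),
  component = data.frame(mukey = c("1", "1", "3"), compname = c("x", "y", "z"))
)

# Retain only rows pertaining to the requested mapunits
keep <- c("1")
extracted <- lapply(tables, function(tbl) tbl[tbl$mukey %in% keep, , drop = FALSE])

nrow(extracted$component)
# 2
```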


Download and crop the 1-km DAYMET v4 daily weather dataset.

Description

get_daymet returns a SpatRaster of weather data cropped to a given template study area.

Usage

get_daymet(
  template,
  label,
  elements = c("dayl", "prcp", "srad", "swe", "tmax", "tmin", "vp"),
  years = 1980:(lubridate::year(Sys.time()) - 1),
  region = "na",
  tempo = "day",
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "daymet", label),
  raster.options = c("COMPRESS=DEFLATE", "ZLEVEL=9", "INTERLEAVE=BAND"),
  force.redo = FALSE,
  progress = TRUE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

label

A character string naming the study area.

elements

A character vector of elements to extract.
The available elements are:
dayl = Duration of the daylight period in seconds per day. This calculation is based on the period of the day during which the sun is above a hypothetical flat horizon.
prcp = Daily total precipitation in millimeters per day, sum of all forms converted to water-equivalent. Precipitation occurrence on any given day may be ascertained.
srad = Incident shortwave radiation flux density in watts per square meter, taken as an average over the daylight period of the day. NOTE: Daily total radiation (MJ/m2/day) can be calculated as follows: ((srad (W/m2) * dayl (s/day)) / 1,000,000)
swe = Snow water equivalent in kilograms per square meter. The amount of water contained within the snowpack.
tmax = Daily maximum 2-meter air temperature in degrees Celsius.
tmin = Daily minimum 2-meter air temperature in degrees Celsius.
vp = Water vapor pressure in pascals. Daily average partial pressure of water vapor.

years

A numeric vector of years to extract.

region

The name of a region. The available regions are:
na = North America
hi = Hawaii
pr = Puerto Rico

tempo

The frequency of the data. The available tempos are:
day = Daily data
mon = Monthly summary data
ann = Annual summary data

extraction.dir

A character string indicating where the extracted and cropped DAYMET data should be put. Defaults to a temporary directory.

raster.options

a vector of GDAL options passed to terra::writeRaster.

force.redo

If an extraction for this template and label already exists in extraction.dir, should a new one be created?

progress

Draw a progress bar when downloading?

Value

A named list of SpatRasters of weather data cropped to the extent of the template.

Examples

## Not run: 
library(terra)

# Get the DAYMET (North America only)
# Returns a list of raster bricks
DAYMET <- get_daymet(
  template = FedData::meve,
  label = "meve",
  elements = c("prcp", "tmin", "tmax"),
  years = 1985
)

# Plot with terra::plot
plot(DAYMET$tmin$`1985-10-23`)

## End(Not run)

Download and crop the Global Historical Climate Network-Daily data.

Description

get_ghcn_daily returns a named list of length 2:

  1. 'spatial': A Simple Feature of the locations of GHCN weather stations in the template, and

  2. 'tabular': A named list of data.frames with the daily weather data for each station. The name of each list item is the station ID.

Usage

get_ghcn_daily(
  template = NULL,
  label = NULL,
  elements = NULL,
  years = NULL,
  raw.dir = file.path(tempdir(), "FedData", "raw", "ghcn"),
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "ned", label),
  standardize = FALSE,
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping. Alternatively, a character vector providing GHCN station IDs. If missing, all stations will be downloaded!

label

A character string naming the study area.

elements

A character vector of elements to extract.
The five core elements are:
PRCP = Precipitation (tenths of mm)
SNOW = Snowfall (mm)
SNWD = Snow depth (mm)
TMAX = Maximum temperature (tenths of degrees C)
TMIN = Minimum temperature (tenths of degrees C)

The other elements are:

ACMC = Average cloudiness midnight to midnight from 30-second ceilometer data (percent)
ACMH = Average cloudiness midnight to midnight from manual observations (percent)
ACSC = Average cloudiness sunrise to sunset from 30-second ceilometer data (percent)
ACSH = Average cloudiness sunrise to sunset from manual observations (percent)
AWDR = Average daily wind direction (degrees)
AWND = Average daily wind speed (tenths of meters per second)
DAEV = Number of days included in the multiday evaporation total (MDEV)
DAPR = Number of days included in the multiday precipitation total (MDPR)
DASF = Number of days included in the multiday snowfall total (MDSF)
DATN = Number of days included in the multiday minimum temperature (MDTN)
DATX = Number of days included in the multiday maximum temperature (MDTX)
DAWM = Number of days included in the multiday wind movement (MDWM)
DWPR = Number of days with non-zero precipitation included in multiday precipitation total (MDPR)
EVAP = Evaporation of water from evaporation pan (tenths of mm)
FMTM = Time of fastest mile or fastest 1-minute wind (hours and minutes, i.e., HHMM)
FRGB = Base of frozen ground layer (cm)
FRGT = Top of frozen ground layer (cm)
FRTH = Thickness of frozen ground layer (cm)
GAHT = Difference between river and gauge height (cm)
MDEV = Multiday evaporation total (tenths of mm; use with DAEV)
MDPR = Multiday precipitation total (tenths of mm; use with DAPR and DWPR, if available)
MDSF = Multiday snowfall total
MDTN = Multiday minimum temperature (tenths of degrees C; use with DATN)
MDTX = Multiday maximum temperature (tenths of degrees C; use with DATX)
MDWM = Multiday wind movement (km)
MNPN = Daily minimum temperature of water in an evaporation pan (tenths of degrees C)
MXPN = Daily maximum temperature of water in an evaporation pan (tenths of degrees C)
PGTM = Peak gust time (hours and minutes, i.e., HHMM)
PSUN = Daily percent of possible sunshine (percent)
SN*# = Minimum soil temperature (tenths of degrees C) where * corresponds to a code for ground cover and # corresponds to a code for soil depth.

Ground cover codes include the following:
0 = unknown
1 = grass
2 = fallow
3 = bare ground
4 = brome grass
5 = sod
6 = straw mulch
7 = grass muck
8 = bare muck

Depth codes include the following:
1 = 5 cm
2 = 10 cm
3 = 20 cm
4 = 50 cm
5 = 100 cm
6 = 150 cm
7 = 180 cm

SX*# = Maximum soil temperature (tenths of degrees C) where * corresponds to a code for ground cover and # corresponds to a code for soil depth.
See SN*# for ground cover and depth codes.
TAVG = Average temperature (tenths of degrees C) (Note that TAVG from source 'S' corresponds to an average for the period ending at 2400 UTC rather than local midnight)
THIC = Thickness of ice on water (tenths of mm)
TOBS = Temperature at the time of observation (tenths of degrees C)
TSUN = Daily total sunshine (minutes)
WDF1 = Direction of fastest 1-minute wind (degrees)
WDF2 = Direction of fastest 2-minute wind (degrees)
WDF5 = Direction of fastest 5-second wind (degrees)
WDFG = Direction of peak wind gust (degrees)
WDFI = Direction of highest instantaneous wind (degrees)
WDFM = Fastest mile wind direction (degrees)
WDMV = 24-hour wind movement (km)
WESD = Water equivalent of snow on the ground (tenths of mm)
WESF = Water equivalent of snowfall (tenths of mm)
WSF1 = Fastest 1-minute wind speed (tenths of meters per second)
WSF2 = Fastest 2-minute wind speed (tenths of meters per second)
WSF5 = Fastest 5-second wind speed (tenths of meters per second)
WSFG = Peak gust wind speed (tenths of meters per second)
WSFI = Highest instantaneous wind speed (tenths of meters per second)
WSFM = Fastest mile wind speed (tenths of meters per second)
WT** = Weather Type where ** has one of the following values:

01 = Fog, ice fog, or freezing fog (may include heavy fog)
02 = Heavy fog or heavy freezing fog (not always distinguished from fog)
03 = Thunder
04 = Ice pellets, sleet, snow pellets, or small hail
05 = Hail (may include small hail)
06 = Glaze or rime
07 = Dust, volcanic ash, blowing dust, blowing sand, or blowing obstruction
08 = Smoke or haze
09 = Blowing or drifting snow
10 = Tornado, waterspout, or funnel cloud
11 = High or damaging winds
12 = Blowing spray
13 = Mist
14 = Drizzle
15 = Freezing drizzle
16 = Rain (may include freezing rain, drizzle, and freezing drizzle)
17 = Freezing rain
18 = Snow, snow pellets, snow grains, or ice crystals
19 = Unknown source of precipitation
21 = Ground fog
22 = Ice fog or freezing fog

WV** = Weather in the Vicinity where ** has one of the following values:
01 = Fog, ice fog, or freezing fog (may include heavy fog)
03 = Thunder
07 = Ash, dust, sand, or other blowing obstruction
18 = Snow or ice crystals
20 = Rain or snow shower

years

A numeric vector indicating which years to get.

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing. Defaults to a temporary directory.

extraction.dir

A character string indicating where the extracted and cropped GHCN data should be put. The directory will be created if missing. Defaults to a temporary directory.

standardize

Select only common year/month/day? Defaults to FALSE.

force.redo

If an extraction for this template and label already exists, should a new one be created? Defaults to FALSE.

Value

A named list containing the 'spatial' and 'tabular' data.

Examples

## Not run: 
# Get the daily GHCN data (GLOBAL)
# Returns a list: the first element is the spatial locations of stations,
# and the second is a list of the stations and their daily data
GHCN.prcp <-
  get_ghcn_daily(
    template = FedData::meve,
    label = "meve",
    elements = c("prcp")
  )

# Plot the VEP polygon
plot(meve)

# Plot the spatial locations
plot(GHCN.prcp$spatial$geometry, pch = 1, add = TRUE)
legend("bottomleft", pch = 1, legend = "GHCN Precipitation Records")

# Elements for which you require the same data
# (i.e., minimum and maximum temperature for the same days)
# can be standardized using `standardize = TRUE`
GHCN.temp <- get_ghcn_daily(
  template = FedData::meve,
  label = "meve",
  elements = c("tmin", "tmax"),
  standardize = TRUE
)

# Plot the VEP polygon
plot(meve)

# Plot the spatial locations
plot(GHCN.temp$spatial$geometry, pch = 1, add = TRUE)
legend("bottomleft", pch = 1, legend = "GHCN Temperature Records")

## End(Not run)
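
Because the core elements are reported in tenths of their units, a common post-processing step after extraction (sketched here with made-up values) is a simple division:

```r
# GHCN core elements are stored in tenths of units:
# PRCP in tenths of mm, TMIN/TMAX in tenths of degrees C
prcp_tenths_mm <- c(0, 25, 103)   # example daily precipitation values
tmax_tenths_c  <- c(221, 250, 198) # example daily maximum temperatures

prcp_mm <- prcp_tenths_mm / 10    # 0.0, 2.5, 10.3 mm
tmax_c  <- tmax_tenths_c / 10     # 22.1, 25.0, 19.8 degrees C
```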

Download and extract the daily data for a GHCN weather station.

Description

get_ghcn_daily_station returns a named list of data.frames, one for each element. If elements is NULL, all available weather tables for the station are returned.

Usage

get_ghcn_daily_station(
  ID,
  elements = NULL,
  years = NULL,
  raw.dir,
  standardize = FALSE,
  force.redo = FALSE
)

Arguments

ID

A character string giving the station ID.

elements

A character vector of elements to extract.
The five core elements are:
PRCP = Precipitation (tenths of mm)
SNOW = Snowfall (mm)
SNWD = Snow depth (mm)
TMAX = Maximum temperature (tenths of degrees C)
TMIN = Minimum temperature (tenths of degrees C)

The other elements are:

ACMC = Average cloudiness midnight to midnight from 30-second ceilometer data (percent)
ACMH = Average cloudiness midnight to midnight from manual observations (percent)
ACSC = Average cloudiness sunrise to sunset from 30-second ceilometer data (percent)
ACSH = Average cloudiness sunrise to sunset from manual observations (percent)
AWDR = Average daily wind direction (degrees)
AWND = Average daily wind speed (tenths of meters per second)
DAEV = Number of days included in the multiday evaporation total (MDEV)
DAPR = Number of days included in the multiday precipitation total (MDPR)
DASF = Number of days included in the multiday snowfall total (MDSF)
DATN = Number of days included in the multiday minimum temperature (MDTN)
DATX = Number of days included in the multiday maximum temperature (MDTX)
DAWM = Number of days included in the multiday wind movement (MDWM)
DWPR = Number of days with non-zero precipitation included in multiday precipitation total (MDPR)
EVAP = Evaporation of water from evaporation pan (tenths of mm)
FMTM = Time of fastest mile or fastest 1-minute wind (hours and minutes, i.e., HHMM)
FRGB = Base of frozen ground layer (cm)
FRGT = Top of frozen ground layer (cm)
FRTH = Thickness of frozen ground layer (cm)
GAHT = Difference between river and gauge height (cm)
MDEV = Multiday evaporation total (tenths of mm; use with DAEV)
MDPR = Multiday precipitation total (tenths of mm; use with DAPR and DWPR, if available)
MDSF = Multiday snowfall total
MDTN = Multiday minimum temperature (tenths of degrees C; use with DATN)
MDTX = Multiday maximum temperature (tenths of degrees C; use with DATX)
MDWM = Multiday wind movement (km)
MNPN = Daily minimum temperature of water in an evaporation pan (tenths of degrees C)
MXPN = Daily maximum temperature of water in an evaporation pan (tenths of degrees C)
PGTM = Peak gust time (hours and minutes, i.e., HHMM)
PSUN = Daily percent of possible sunshine (percent)
SN*# = Minimum soil temperature (tenths of degrees C) where * corresponds to a code for ground cover and # corresponds to a code for soil depth.

Ground cover codes include the following:
0 = unknown
1 = grass
2 = fallow
3 = bare ground
4 = brome grass
5 = sod
6 = straw mulch
7 = grass muck
8 = bare muck

Depth codes include the following:
1 = 5 cm
2 = 10 cm
3 = 20 cm
4 = 50 cm
5 = 100 cm
6 = 150 cm
7 = 180 cm

SX*# = Maximum soil temperature (tenths of degrees C) where * corresponds to a code for ground cover and # corresponds to a code for soil depth.
See SN*# for ground cover and depth codes.
TAVG = Average temperature (tenths of degrees C) (Note that TAVG from source 'S' corresponds to an average for the period ending at 2400 UTC rather than local midnight)
THIC = Thickness of ice on water (tenths of mm)
TOBS = Temperature at the time of observation (tenths of degrees C)
TSUN = Daily total sunshine (minutes)
WDF1 = Direction of fastest 1-minute wind (degrees)
WDF2 = Direction of fastest 2-minute wind (degrees)
WDF5 = Direction of fastest 5-second wind (degrees)
WDFG = Direction of peak wind gust (degrees)
WDFI = Direction of highest instantaneous wind (degrees)
WDFM = Fastest mile wind direction (degrees)
WDMV = 24-hour wind movement (km)
WESD = Water equivalent of snow on the ground (tenths of mm)
WESF = Water equivalent of snowfall (tenths of mm)
WSF1 = Fastest 1-minute wind speed (tenths of meters per second)
WSF2 = Fastest 2-minute wind speed (tenths of meters per second)
WSF5 = Fastest 5-second wind speed (tenths of meters per second)
WSFG = Peak gust wind speed (tenths of meters per second)
WSFI = Highest instantaneous wind speed (tenths of meters per second)
WSFM = Fastest mile wind speed (tenths of meters per second)
WT** = Weather Type where ** has one of the following values:

01 = Fog, ice fog, or freezing fog (may include heavy fog)
02 = Heavy fog or heavy freezing fog (not always distinguished from fog)
03 = Thunder
04 = Ice pellets, sleet, snow pellets, or small hail
05 = Hail (may include small hail)
06 = Glaze or rime
07 = Dust, volcanic ash, blowing dust, blowing sand, or blowing obstruction
08 = Smoke or haze
09 = Blowing or drifting snow
10 = Tornado, waterspout, or funnel cloud
11 = High or damaging winds
12 = Blowing spray
13 = Mist
14 = Drizzle
15 = Freezing drizzle
16 = Rain (may include freezing rain, drizzle, and freezing drizzle)
17 = Freezing rain
18 = Snow, snow pellets, snow grains, or ice crystals
19 = Unknown source of precipitation
21 = Ground fog
22 = Ice fog or freezing fog

WV** = Weather in the Vicinity where ** has one of the following values:
01 = Fog, ice fog, or freezing fog (may include heavy fog)
03 = Thunder
07 = Ash, dust, sand, or other blowing obstruction
18 = Snow or ice crystals
20 = Rain or snow shower

years

A numeric vector indicating which years to get.

raw.dir

A character string indicating where raw downloaded files should be put.

standardize

Select only common year/month/day? Defaults to FALSE.

force.redo

If this weather station has been downloaded before, should it be updated? Defaults to FALSE.

Value

A named list of data.frames, one for each element.


Download and crop the inventory of GHCN stations.

Description

get_ghcn_inventory returns a Simple Feature collection of the GHCN stations within the specified template. If template is not provided, the entire GHCN inventory is returned.

Usage

get_ghcn_inventory(template = NULL, elements = NULL, raw.dir)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

elements

A character vector of elements to extract. Common elements include 'tmin', 'tmax', and 'prcp'.

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing.

Details

Stations with multiple elements will have multiple points. This allows for easy mapping of stations by element availability.

Value

A Simple Feature of the GHCN stations within the specified template.


Download the latest version of the ITRDB, and extract given parameters.

Description

get_itrdb returns a named list of length 3:

  1. 'metadata': A data frame or Simple Feature (if makeSpatial==TRUE) of the locations and names of extracted ITRDB chronologies,

  2. 'widths': A matrix of tree-ring widths/densities given user selection, and

  3. 'depths': A matrix of tree-ring sample depths.

Usage

get_itrdb(
  template = NULL,
  label = NULL,
  recon.years = NULL,
  calib.years = NULL,
  species = NULL,
  measurement.type = NULL,
  chronology.type = NULL,
  raw.dir = paste0(tempdir(), "/FedData/raw/itrdb"),
  extraction.dir = ifelse(!is.null(label), paste0(tempdir(),
    "/FedData/extractions/itrdb/", label, "/"), paste0(tempdir(),
    "/FedData/extractions/itrdb")),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping. If missing, all available global chronologies are returned.

label

A character string naming the study area.

recon.years

A numeric vector of years over which reconstructions are needed; if missing, the union of all years in the available chronologies is given.

calib.years

A numeric vector of required years; chronologies lacking any of these years are discarded. If missing, all available chronologies are given.

species

A character vector of 4-letter tree species identifiers; if missing, all available chronologies are given.

measurement.type

A character vector of measurement type identifiers. Options include:

  • 'Total Ring Density'

  • 'Earlywood Width'

  • 'Earlywood Density'

  • 'Latewood Width'

  • 'Minimum Density'

  • 'Ring Width'

  • 'Latewood Density'

  • 'Maximum Density'

  • 'Latewood Percent'

if missing, all available chronologies are given.

chronology.type

A character vector of chronology type identifiers. Options include:

  • 'ARSTND'

  • 'Low Pass Filter'

  • 'Residual'

  • 'Standard'

  • 'Re-Whitened Residual'

  • 'Measurements Only'

if missing, all available chronologies are given.

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing.

extraction.dir

A character string indicating where the extracted and cropped ITRDB dataset should be put. The directory will be created if missing.

force.redo

If an extraction already exists, should a new one be created? Defaults to FALSE.

Value

A named list containing the 'metadata', 'widths', and 'depths' data.

Examples

## Not run: 
# Get the ITRDB records
ITRDB <- get_itrdb(
  template = FedData::meve,
  label = "meve"
)

# Plot the VEP polygon
plot(meve)

# Map the locations of the tree ring chronologies
plot(ITRDB$metadata$geometry, pch = 1, add = TRUE)
legend("bottomleft", pch = 1, legend = "ITRDB chronologies")

## End(Not run)

Download and crop the NASS Cropland Data Layer.

Description

get_nass_cdl returns a SpatRaster of the NASS Cropland Data Layer cropped to a given template study area.

Usage

get_nass_cdl(
  template,
  label,
  year = 2019,
  extraction.dir = paste0(tempdir(), "/FedData/"),
  raster.options = c("COMPRESS=DEFLATE", "ZLEVEL=9", "INTERLEAVE=BAND"),
  force.redo = FALSE,
  progress = TRUE
)

get_nass(template, label, ...)

get_cdl(template, label, ...)

cdl_colors()

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

label

A character string naming the study area.

year

An integer representing the year of the desired NASS Cropland Data Layer product. Acceptable values are 2007 through the most recent year available.

extraction.dir

A character string indicating where the extracted and cropped NASS data should be put. The directory will be created if missing.

raster.options

A vector of GDAL options passed to terra::writeRaster.

force.redo

If an extraction for this template and label already exists, should a new one be created?

progress

Draw a progress bar when downloading?

...

Other parameters passed on to get_nass_cdl.

Value

A SpatRaster cropped to the bounding box of the template.

Examples

## Not run: 
# Extract data for the Mesa Verde National Park:

# Get the NASS CDL (USA ONLY)
# Returns a raster
NASS <-
  get_nass_cdl(
    template = FedData::meve,
    label = "meve",
    year = 2011
  )

# Plot with terra::plot
terra::plot(NASS)

## End(Not run)

Download and crop the 1 (~30 meter) or 1/3 (~10 meter) arc-second National Elevation Dataset.

Description

get_ned returns a SpatRaster of elevation data cropped to a given template study area.

Usage

get_ned(
  template,
  label,
  res = "1",
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "ned", label),
  raster.options = c("COMPRESS=DEFLATE", "ZLEVEL=9"),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

label

A character string naming the study area.

res

A character string representing the desired resolution of the NED. '1' indicates the 1 arc-second NED (the default), while '13' indicates the 1/3 arc-second dataset.

extraction.dir

A character string indicating where the extracted and cropped DEM should be put. The directory will be created if missing.

raster.options

a vector of GDAL options passed to terra::writeRaster.

force.redo

If an extraction for this template and label already exists, should a new one be created?

Value

A SpatRaster DEM cropped to the extent of the template.

Examples

## Not run: 
# Get the NED (USA ONLY)
# Returns a `SpatRaster`
NED <-
  get_ned(
    template = FedData::meve,
    label = "meve"
  )

# Plot with terra::plot
terra::plot(NED)

## End(Not run)

Load and crop tile from the 1 (~30 meter) or 1/3 (~10 meter) arc-second National Elevation Dataset.

Description

get_ned_tile returns a SpatRaster cropped to the specified template. If template is not provided, returns the entire NED tile.

Usage

get_ned_tile(template = NULL, res = "1", tileNorthing, tileWesting)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping. If missing, the entire tile is returned.

res

A character string representing the desired resolution of the NED. '1' indicates the 1 arc-second NED (the default), while '13' indicates the 1/3 arc-second dataset.

tileNorthing

An integer representing the northing (latitude, in degrees north of the equator) of the northwest corner of the tile to be downloaded.

tileWesting

An integer representing the westing (longitude, in degrees west of the prime meridian) of the northwest corner of the tile to be downloaded.

Value

A SpatRaster cropped to the extent of the template.
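
Examples

The manual provides no example for get_ned_tile; the sketch below is illustrative only. The tile indices are assumptions: NED tiles are indexed by their northwest corner, so the 1 arc-second tile covering Mesa Verde National Park (about 37.2°N, 108.5°W) should have northing 38 and westing 109.

```r
## Not run:
# Load the 1 arc-second NED tile whose northwest corner is 38°N, 109°W,
# cropped to the Mesa Verde National Park boundary
NED_tile <-
  get_ned_tile(
    template = FedData::meve,
    res = "1",
    tileNorthing = 38,
    tileWesting = 109
  )

# Plot with terra::plot
terra::plot(NED_tile)

## End(Not run)
```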


Download and crop the National Hydrography Dataset.

Description

get_nhd returns a list of Simple Feature objects extracted from the National Hydrography Dataset.

Usage

get_nhd(
  template,
  label,
  nhdplus = FALSE,
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "nhd", label),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

label

A character string naming the study area.

nhdplus

Extract data from the USGS NHDPlus High Resolution service (experimental).

extraction.dir

A character string indicating where the extracted and cropped NHD data should be put.

force.redo

If an extraction for this template and label already exists, should a new one be created?

Value

A list of sf collections extracted from the National Hydrography Dataset.

Examples

## Not run: 
# Get the NHD (USA ONLY)
NHD <- get_nhd(
  template = FedData::meve,
  label = "meve"
)
NHD
NHD %>%
  plot_nhd(template = FedData::meve)

## End(Not run)

Download and crop the National Land Cover Database.

Description

get_nlcd returns a SpatRaster of NLCD data cropped to a given template study area. nlcd_colors and pal_nlcd return the NLCD legend and color palette, as available through the MRLC website.

Usage

get_nlcd(
  template,
  label,
  year = 2021,
  dataset = "landcover",
  landmass = "L48",
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "nlcd", label),
  raster.options = c("COMPRESS=DEFLATE", "ZLEVEL=9"),
  force.redo = FALSE
)

nlcd_colors()

pal_nlcd()

Arguments

template

A Simple Feature or terra object to serve as a template for cropping.

label

A character string naming the study area.

year

An integer representing the year of the desired NLCD product. Acceptable values are 2021 (default), 2019, 2016, 2011, 2008, 2006, 2004, and 2001. The L48 data set for 2021 is corrupted on the NLCD Mapserver, and is thus not available through FedData.

dataset

A character string representing type of the NLCD product. Acceptable values are 'landcover' (default), 'impervious', and 'canopy'.

landmass

A character string representing the landmass to be extracted. Acceptable values are 'L48' (lower 48 US states, the default), 'AK' (Alaska, 2001, 2011, and 2016 only), 'HI' (Hawaii, 2001 only), and 'PR' (Puerto Rico, 2001 only).

extraction.dir

A character string indicating where the extracted and cropped NLCD data should be put. The directory will be created if missing.

raster.options

a vector of GDAL options passed to terra::writeRaster.

force.redo

If an extraction for this template and label already exists, should a new one be created?

Value

A SpatRaster cropped to the bounding box of the template.

Examples

## Not run: 
# Extract data for the Mesa Verde National Park:

# Get the NLCD (USA ONLY)
# Returns a raster
NLCD <-
  get_nlcd(
    template = FedData::meve,
    label = "meve",
    year = 2016
  )

# Plot with terra::plot
terra::plot(NLCD)

## End(Not run)

Download and crop the Annual National Land Cover Database.

Description

get_nlcd_annual returns a SpatRaster of NLCD data cropped to a given template study area. The Annual NLCD is currently only available for the conterminous United States. More information about the Annual NLCD product is available on the Annual NLCD web page.

Usage

get_nlcd_annual(
  template,
  label,
  year = 2023,
  product = "LndCov",
  region = "CU",
  collection = 1,
  version = 0,
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "nlcd_annual", label),
  raster.options = c("COMPRESS=DEFLATE", "ZLEVEL=9"),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or terra object to serve as a template for cropping.

label

A character string naming the study area.

year

An integer vector representing the year of desired NLCD product. Acceptable values are currently 1985 through 2023 (defaults to 2023).

product

A character vector representing the type of NLCD product. Defaults to 'LndCov' (Land Cover). Options include:

  • 'LndCov': Land Cover

  • 'LndChg': Land Cover Change

  • 'LndCnf': Land Cover Confidence

  • 'FctImp': Fractional Impervious Surface

  • 'ImpDsc': Impervious Descriptor

  • 'SpcChg': Spectral Change Day of Year

region

A character string representing the region to be extracted. Acceptable values are 'CU' (Conterminous US, the default), 'AK' (Alaska), and 'HI' (Hawaii). Currently, only 'CU' is available.

collection

An integer representing the collection number. Currently, only '1' is available.

version

An integer representing the version number. Currently, only '0' is available.

extraction.dir

A character string indicating where the extracted and cropped NLCD data should be put. The directory will be created if missing.

raster.options

a vector of GDAL options passed to terra::writeRaster.

force.redo

If an extraction for this template and label already exists, should a new one be created?

Value

A SpatRaster cropped to the bounding box of the template.

Examples

## Not run: 
# Extract data for the Mesa Verde National Park:

# Get the NLCD (USA ONLY)
# Returns a raster
NLCD_ANNUAL <-
  get_nlcd_annual(
    template = FedData::meve,
    label = "meve",
    year = 2020,
    product =
      c(
        "LndCov",
        "LndChg",
        "LndCnf",
        "FctImp",
        "ImpDsc",
        "SpcChg"
      )
  )

NLCD_ANNUAL

## End(Not run)

Download and crop the PAD-US Dataset.

Description

get_padus returns a list of sf objects extracted from the PAD-US Dataset. Data are retrieved directly from PAD-US ArcGIS Web Services.

Usage

get_padus(
  template,
  label,
  layer = c("Manager_Name"),
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "padus", label),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping. Optionally, a vector of unit names, e.g., c('Mesa Verde National Park','Ute Mountain Reservation'), may be provided.

label

A character string naming the study area.

layer

A character vector containing one or more PAD-US Layers. By default, the Manager_Name layer is downloaded.

  • Protection_Status_by_GAP_Status_Code: PAD-US Protection Status by GAP Status Code — Service representing a measure of management intent to permanently protect biodiversity. GAP 1 and 2 areas are primarily managed for biodiversity, GAP 3 areas are managed for multiple uses including conservation and extraction, and GAP 4 areas have no known mandate for biodiversity protection. GAP Status Codes 1-3 are displayed; GAP 4 areas are included but not displayed.

  • Public_Access: PAD-US Public Access — Service representing general level of public access permitted in the area - Open, Restricted (permit, seasonal), Closed. Public Access Unknown areas not displayed. Use to show general categories of public access (however, not all areas have been locally reviewed).

  • Fee_Managers: PAD-US Fee Managers — Service providing manager or administrative agency names standardized nationally. Use for categorization by manager name, with detailed federal managers and generic state/local/other managers. Where available this layer includes fee simple parcels from the Fee feature class plus DOD and Tribal areas from the Proclamation feature class.

  • Manager_Name: PAD-US Manager Name — Service representing fine level manager or administrative agency name standardized for the Nation (USFS, BLM, State Fish and Wildlife, State Parks and Rec, City, NGO, etc). This map is based on the PAD-US Combined Proclamation, Marine, Fee, Designation, Easement feature class. DOD and Tribal areas shown with 50% transparency. Use for categorization by manager name, with detailed federal managers and generic state/local/other managers.

  • Manager_Type: PAD-US Manager Type — Service representing coarse level land manager description from "Agency Type" Domain, "Manager Type" Field (for example, Federal, Tribal, State, Local Gov, Private). Use for broad categorization of manager levels, for general depictions of who manages what areas.

  • Federal_Fee_Managers_Authoritative: PAD-US Federal Fee Managers Authoritative — Service describing authoritative fee data for federal managers or administrative agencies by name. U.S. Department of Defense and Tribal areas shown from the Proclamation feature class. Use to depict authoritative fee data for individual federal management agencies (no state, local or private lands). This service does not include designations that often overlap state, private or other inholdings. U.S. Department of Defense internal land ownership is not represented but is implied Federal. See the Federal Management Agencies service for a combined view of fee ownership, designations, and easements.

  • Federal_Management_Agencies: PAD-US Federal Management Agencies — Service providing Federal managers or administrative agencies by name. Use to depict individual federal management agencies (no state, local or private lands). This map is based on the Combined Proclamation, Marine, Fee, Designation, Easement feature class.

  • Protection_Mechanism_Category: PAD-US Protection Mechanism Category — Service representing the protection mechanism category including fee simple, internal management designations, easements, leases and agreements, and Marine Areas. Use to show categories of land tenure for all protected areas, including marine areas.

  • Proclamation_and_Other_Planning_Boundaries: PAD-US Proclamation and Other Planning Boundaries — Service representing boundaries that provide additional context. Administrative agency name standardized for the nation (DOD, FWS, NPS, USFS, Tribal). Boundaries shown with outline only, as proclamation data do not depict actual ownership or management. Use to show outline of agency proclamation, approved acquisition or other planning boundaries where internal ownership is not depicted.

extraction.dir

A character string indicating where the extracted and cropped PAD-US data should be put.

force.redo

If an extraction for this template and label already exists, should a new one be created?

Details

PAD-US is America’s official national inventory of U.S. terrestrial and marine protected areas that are dedicated to the preservation of biological diversity and to other natural, recreation and cultural uses, managed for these purposes through legal or other effective means. PAD-US also includes the best available aggregation of federal land and marine areas provided directly by managing agencies, coordinated through the Federal Geographic Data Committee Federal Lands Working Group.

Value

A list of sf::sf collections extracted from the PAD-US Dataset.

Examples

## Not run: 
# Get the PAD-US (USA ONLY)
PADUS <- get_padus(
  template = FedData::meve,
  label = "meve"
)
PADUS

## End(Not run)

Download and crop data from the NRCS SSURGO soils database.

Description

This is an efficient method for spatially merging several different soil survey areas as well as merging their tabular data.

Usage

get_ssurgo(
  template,
  label,
  raw.dir = paste0(tempdir(), "/FedData/raw/ssurgo"),
  extraction.dir = paste0(tempdir(), "/FedData/"),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping. Optionally, a vector of area names, e.g., c('IN087','IN088'), may be provided.

label

A character string naming the study area.

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing. Defaults to a 'FedData/raw/ssurgo' directory within tempdir().

extraction.dir

A character string indicating where the extracted and cropped SSURGO shapefiles should be put. The directory will be created if missing. Defaults to a 'FedData' directory within tempdir().

force.redo

If an extraction for this template and label already exists, should a new one be created? Defaults to FALSE.

Details

get_ssurgo returns a named list of length 2:

  1. 'spatial': A Simple Feature of soil mapunits in the template, and

  2. 'tabular': A named list of data.frames with the SSURGO tabular data.

Value

A named list containing the 'spatial' and 'tabular' data.

Examples

## Not run: 
# Get the NRCS SSURGO data (USA ONLY)
SSURGO.MEVE <-
  get_ssurgo(
    template = FedData::meve,
    label = "meve"
  )

# Plot the VEP polygon
plot(meve)

# Plot the SSURGO mapunit polygons
plot(SSURGO.MEVE$spatial["MUKEY"],
  lwd = 0.1,
  add = TRUE
)

# Or, download by Soil Survey Area names
SSURGO.areas <-
  get_ssurgo(
    template = c("CO670", "CO075"),
    label = "CO_TEST"
  )

# Let's just look at spatial data for CO075
SSURGO.areas.CO075 <-
  SSURGO.areas$spatial[SSURGO.areas$spatial$AREASYMBOL == "CO075", ]

# And get the NED data under them for pretty plotting
NED.CO075 <-
  get_ned(
    template = SSURGO.areas.CO075,
    label = "SSURGO_CO075"
  )

# Plot the SSURGO mapunit polygons, but only for CO075
terra::plot(NED.CO075)
plot(
  SSURGO.areas.CO075$geom,
  lwd = 0.1,
  add = TRUE
)

## End(Not run)

Download and crop a shapefile of the SSURGO study areas.

Description

get_ssurgo_inventory returns a SpatialPolygonsDataFrame of the SSURGO study areas within the specified template. If template is not provided, returns the entire SSURGO inventory of study areas.

Usage

get_ssurgo_inventory(template = NULL, raw.dir)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing.

Value

A SpatialPolygonsDataFrame of the SSURGO study areas within the specified template.
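
Examples

No example accompanies get_ssurgo_inventory above; a minimal sketch, assuming the NRCS SSURGO web service is reachable:

```r
## Not run:
# Get the SSURGO study areas overlapping Mesa Verde National Park (USA ONLY)
SSURGO_inventory <-
  get_ssurgo_inventory(
    template = FedData::meve,
    raw.dir = file.path(tempdir(), "FedData", "raw", "ssurgo")
  )

# Or retrieve the entire inventory of study areas
SSURGO_inventory_all <-
  get_ssurgo_inventory(
    raw.dir = file.path(tempdir(), "FedData", "raw", "ssurgo")
  )

## End(Not run)
```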


Download and crop the spatial and tabular data for a SSURGO study area.

Description

get_ssurgo_study_area returns a named list of length 2:

  1. 'spatial': A Simple Feature of soil mapunits in the template, and

  2. 'tabular': A named list of data.frames with the SSURGO tabular data.

Usage

get_ssurgo_study_area(template = NULL, area, date, raw.dir)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping. If missing, the entire study area is returned.

area

A character string indicating the SSURGO study area to be downloaded.

date

A character string indicating the date of the most recent update to the SSURGO area for these data. This information may be gleaned from the SSURGO Inventory (get_ssurgo_inventory).

raw.dir

A character string indicating where raw downloaded files should be put. The directory will be created if missing.

Value

A SpatialPolygonsDataFrame of the SSURGO study areas within the specified template.
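
Examples

A hedged sketch: 'CO670' is the area used in the get_ssurgo example above, and the date below is hypothetical; it should be taken from the SSURGO inventory (get_ssurgo_inventory).

```r
## Not run:
# Download the spatial and tabular data for one SSURGO study area
CO670 <-
  get_ssurgo_study_area(
    template = NULL,
    area = "CO670",
    date = "2021-09-09", # hypothetical; look it up with get_ssurgo_inventory()
    raw.dir = file.path(tempdir(), "FedData", "raw", "ssurgo")
  )

## End(Not run)
```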


Download and crop the Watershed Boundary Dataset.

Description

get_wbd returns a Simple Feature collection of the HUC 12 regions within the specified template.

Usage

get_wbd(
  template,
  label,
  extraction.dir = file.path(tempdir(), "FedData", "extractions", "nhd", label),
  force.redo = FALSE
)

Arguments

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

label

A character string naming the study area.

extraction.dir

A character string indicating where the extracted and cropped NHD data should be put.

force.redo

If an extraction for this template and label already exists, should a new one be created?

Value

An sf collection of the HUC 12 regions within the specified template.
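
Examples

get_wbd lacks an example above; the following mirrors the other examples in this manual:

```r
## Not run:
# Get the Watershed Boundary Dataset HUC 12 regions (USA ONLY)
WBD <- get_wbd(
  template = FedData::meve,
  label = "meve"
)

# Plot the watershed polygons
plot(sf::st_geometry(WBD))

## End(Not run)
```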


The boundary of Mesa Verde National Park

Description

A dataset containing the spatial polygon defining the boundary of Mesa Verde National Park in southwestern Colorado.

Usage

meve

Format

Simple feature collection with 1 feature and a geometry field.
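
Examples

A short demonstration of the bundled boundary (the plotting call mirrors the get_itrdb example above):

```r
# Print the Simple Feature collection
FedData::meve

# Plot the park boundary
plot(FedData::meve)
```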


A basic plotting function for NHD data.

Description

This is more of an example than anything else.

Usage

plot_nhd(x, template = NULL)

Arguments

x

The result of get_nhd.

template

A Simple Feature or SpatRaster object to serve as a template for cropping.

Value

A ggplot2 panel of plots

Examples

## Not run: 
# Get the NHD (USA ONLY)
NHD <- get_nhd(
  template = FedData::meve,
  label = "meve"
)
NHD
NHD %>%
  plot_nhd(template = FedData::meve)

## End(Not run)

Turn an extent object into a polygon

Description

Turn an extent object into a polygon

Usage

polygon_from_extent(x, proj4string = NULL)

Arguments

x

An object from which a bounding box can be retrieved.

proj4string

A PROJ.4 formatted string defining the required projection.

Value

A Simple Feature object.
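
Examples

A minimal sketch. Whether a terra extent is accepted is an assumption based on the description above ("an object from which a bounding box can be retrieved"); the coordinates approximate the Mesa Verde area.

```r
## Not run:
# Build a polygon from an extent (xmin, xmax, ymin, ymax)
mv_extent <- terra::ext(-108.6, -108.3, 37.1, 37.4)

mv_polygon <-
  polygon_from_extent(
    mv_extent,
    proj4string = "+proj=longlat +datum=WGS84"
  )

## End(Not run)
```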


Read a Tucson-format chronology file.

Description

This function includes improvements to the read.crn function from the dplR library. The principal changes are better parsing of metadata and support for the Schweingruber-type Tucson format. Chronologies that cannot be read are reported to the user. This function automatically recognizes Schweingruber-type files.

Usage

read_crn(file)

Arguments

file

A character string path pointing to a *.crn file to be read.

Details

This wraps two other functions: read_crn_metadata and read_crn_data.

Value

A list containing the metadata and chronology.
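
Examples

A sketch with a hypothetical file path; any Tucson-format *.crn file from the ITRDB will do. The element names below follow the Value section ("metadata and chronology") and are assumptions.

```r
## Not run:
# Read a local Tucson-format chronology file (hypothetical path)
crn <- read_crn("co588.crn")

# Inspect the parsed pieces (element names assumed from the Value section)
crn$metadata
crn$chronology

## End(Not run)
```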


Read chronology data from a Tucson-format chronology file.

Description

This function includes improvements to the read_crn function from the dplR library. The principal changes are better parsing of metadata and support for the Schweingruber-type Tucson format. Chronologies that cannot be read are reported to the user. The user (or read_crn) must tell the function whether the file is a Schweingruber-type chronology.

Usage

read_crn_data(file, SCHWEINGRUBER)

Arguments

file

A character string path pointing to a *.crn file to be read.

SCHWEINGRUBER

Is the file in the Schweingruber-type Tucson format?

Value

A data.frame containing the data or, if SCHWEINGRUBER == TRUE, a list containing four types of data.


Read metadata from a Tucson-format chronology file.

Description

This function includes improvements to the read_crn function from the dplR library. The principal changes are better parsing of metadata and support for the Schweingruber-type Tucson format. Chronologies that cannot be read are reported to the user. The user (or read_crn) must tell the function whether the file is a Schweingruber-type chronology.

Usage

read_crn_metadata(file, SCHWEINGRUBER)

Arguments

file

A character string path pointing to a *.crn file to be read.

SCHWEINGRUBER

Is the file in the Schweingruber-type Tucson format?

Details

Location information is converted to decimal degrees.

Value

A data.frame containing the metadata.


Replace NULLs

Description

Replace all the empty values in a list

Usage

replace_null(x)

Arguments

x

A list

Value

A list with NULLs replaced by NA

Examples

list(a = NULL, b = 1, c = list(foo = NULL, bar = NULL)) %>% replace_null()

Get a logical vector of which elements in a vector are sequentially duplicated.

Description

Get a logical vector of which elements in a vector are sequentially duplicated.

Usage

sequential_duplicated(x, rows = FALSE)

Arguments

x

A vector of any type or, if rows is TRUE, a matrix.

rows

Is x a matrix?

Value

A logical vector of the same length as x.
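
Examples

A short illustration. Per the description above, the first element of a run of equal values is not a duplicate, while subsequent repeats are; the exact semantics are inferred from that description.

```r
# Flag elements equal to their immediate predecessor
sequential_duplicated(c("a", "a", "b", "b", "b", "a"))
```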


Submit a Soil Data Access (SDA) Query

Description

soils_query submits an SQL query to retrieve data from the Soil Data Mart. Please see https://sdmdataaccess.sc.egov.usda.gov/Query.aspx for guidelines.

Usage

soils_query(q)

Arguments

q

A character string representing a SQL query to the SDA service

Value

A tibble returned from the SDA service
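
Examples

A small query; the table and column names below are assumed from the published SDA schema (sacatalog holds the survey-area catalog).

```r
## Not run:
# Fetch the name of, and most recent update date for, one survey area
soils_query(
  "SELECT areasymbol, saverest
   FROM sacatalog
   WHERE areasymbol = 'CO670'"
)

## End(Not run)
```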


Splits a bbox into a list of bboxes less than a certain size

Description

Splits a bbox into a list of bboxes less than a certain size

Usage

split_bbox(bbox, x, y = x)

Arguments

bbox

The bbox object to be split.

x

The maximum x size of the resulting bounding boxes

y

The maximum y size of the resulting bounding boxes; defaults to x

Value

A list of bbox objects
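
Examples

A minimal sketch; the 0.1 tile size is an arbitrary choice, interpreted in the bbox's map units (degrees for an unprojected bbox).

```r
## Not run:
# Split the Mesa Verde bounding box into tiles at most 0.1 units on a side
tiles <- split_bbox(
  bbox = sf::st_bbox(FedData::meve),
  x = 0.1
)

length(tiles)

## End(Not run)
```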


Convert a list of station data to a single data frame.

Description

station_to_data_frame returns a data.frame of the GHCN station data list.

Usage

station_to_data_frame(station.data)

Arguments

station.data

A named list containing station data

Details

This function unwraps the station data and merges all data into a single data frame, with the first column being in the Date class.

Value

A data.frame containing the unwrapped station data.
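
Examples

A hedged sketch: the station list is assumed to come from get_ghcn_daily (the package's GHCN downloader), and the indexing into its tabular element is an assumption.

```r
## Not run:
# Get GHCN daily precipitation for Mesa Verde National Park (USA ONLY)
GHCN <- get_ghcn_daily(
  template = FedData::meve,
  label = "meve",
  elements = "prcp"
)

# Flatten one station's data to a single data.frame
# (indexing into $tabular is an assumption)
station_to_data_frame(GHCN$tabular[[1]])

## End(Not run)
```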


Get the rightmost 'n' characters of a character string.

Description

Get the rightmost 'n' characters of a character string.

Usage

substr_right(x, n)

Arguments

x

A character string.

n

The number of characters to retrieve.

Value

A character string.
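
Examples

A direct illustration of the documented behavior:

```r
# Rightmost four characters of the string
substr_right("FedData", 4)
# "Data"
```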


Unwrap a matrix, keeping only the first n elements of each row.

Description

A function that unwraps a matrix by row, keeping only the first n elements of each row. n can be either a single value (in which case it is recycled) or a vector with one value per row.

Usage

unwrap_rows(mat, n)

Arguments

mat

A matrix

n

A numeric vector

Value

A vector of the retained matrix elements.
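
Examples

A sketch of per-row truncation, assuming n is applied row-wise (implied by the function name and the vector-valued n):

```r
# Keep the first 2, 1, and 3 elements of rows 1-3, then concatenate
m <- matrix(1:12, nrow = 3, byrow = TRUE)
unwrap_rows(m, n = c(2, 1, 3))
```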


Strip query parameters from a URL

Description

Strip query parameters from a URL

Usage

url_base(x)

Arguments

x

The URL to be modified

Value

The URL without parameters
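
Examples

A direct illustration, per the description above (everything from the query string onward is stripped):

```r
url_base("https://www.mrlc.gov/geoserver/wms?service=WMS&request=GetMap")
# "https://www.mrlc.gov/geoserver/wms"
```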