2017 Nov 1
- Deprecation: GridBounds.size in favor of GridBounds.sizeLong.
- Deprecation: GridBounds.coords in favor of GridBounds.coordsIter.
- New: GridBounds.buffer for creating a modified GridBounds from an existing one.
- New: ColorRamps.greyscale: Int => ColorRamp, which will generate a ramp when given some number of stops.
- New: ConstantTile.fromBytes to create any type of ConstantTile from an Array[Byte].
- Deprecation: the LayerUpdater trait hierarchy. Use LayerWriter.update or LayerWriter.overwrite instead.
- Deprecation: every cache provided by geotrellis.spark.util.cache. These will be removed in favor of a pluggable cache in 2.0.
- New: SpatialKey.extent: LayoutDefinition => Extent
- New: TileLayerRDD.toSpatialReduce: ((V, V) => V) => TileLayerRDD[SpatialKey] for smarter folding of 3D tile layers into 2D tile layers.
- The often-used apply method overloads in MapKeyTransform have been given more descriptive aliases.
- The library has been simplified by assuming the codec backend will always be Protobuf, deprecating the previous codec abstractions.
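To make a couple of these additions concrete, here is a hedged sketch (not from the release notes; the layout value is assumed to be in scope):

```scala
import geotrellis.raster.render.{ColorRamp, ColorRamps}
import geotrellis.spark.SpatialKey
import geotrellis.spark.tiling.LayoutDefinition

// Sketch only: assumes some LayoutDefinition already exists.
val layout: LayoutDefinition = ???

// A greyscale ColorRamp with 16 stops, via the new ColorRamps.greyscale.
val ramp: ColorRamp = ColorRamps.greyscale(16)

// The Extent covered by a key under the layout, via the new SpatialKey.extent.
val extent = SpatialKey(2, 3).extent(layout)
```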
Finally, the full marriage of the vector and raster sides of GeoTrellis: you can now transform an
RDD[Geometry] into a writable GeoTrellis layer of (SpatialKey, Tile):
```scala
val geoms: RDD[Geometry] = ...
val celltype: CellType = ...
val layout: LayoutDefinition = ...
val value: Double = ...  /* Value to fill the intersecting pixels with */

val layer: RDD[(SpatialKey, Tile)] with Metadata[LayoutDefinition] =
  geoms.rasterize(value, celltype, layout)
```
Polygonal Summaries over Time¶
The following was possible prior to GeoTrellis 1.2:
```scala
val layer: TileLayerRDD[SpatialKey] = ...
val polygon: Polygon = ...

/* The maximum value within some Polygon overlaid on a Tile layer */
val summary: Double = layer.polygonalMaxDouble(polygon)
```
The above is also now possible for layers keyed by
SpaceTimeKey to form
a “time series”:
```scala
val layer: TileLayerRDD[SpaceTimeKey] = ...
val polygon: MultiPolygon = ...

/* The maximum value within some Polygonal area at each time slice */
val summary: Map[ZonedDateTime, Double] = layer.maxSeries(polygon)
```
ValueReader connects to some layer catalog and lets you read
individual values (usually Tiles):
```scala
import geotrellis.spark.io.s3._

val store: AttributeStore = ...

val reader: Reader[SpatialKey, Tile] =
  S3ValueReader(store).reader(LayerId("my-catalog", 10))

val tile: Tile = reader.read(SpatialKey(10, 10))
```
.reader is limited to zoom levels that actually exist for the given layer.
Now you can use .overzoomingReader to go as deep as you like:
```scala
import geotrellis.raster.resample._

val reader: Reader[SpatialKey, Tile] =
  S3ValueReader(store).overzoomingReader(LayerId("my-catalog", 20), Average)

val tile: Tile = reader.read(SpatialKey(1000, 1000))
```
Regridding a Tile Layer¶
Have you ever wanted to “redraw” a grid over an established GeoTrellis layer? Say, this 16-tile Layer into a 4-tile one, both of 1024x1024 total pixels:
Prior to GeoTrellis 1.2, there was no official way to do this. Now you can use the new regrid method:
```scala
/* The result of some previous work. Say each Tile is 256x256. */
val layer: TileLayerRDD[SpatialKey] = ...

/* "Recut" the tiles so that each one is now 512x512.
 * No pixels are gained or lost, save some NODATA on the bottom
 * and right edges that may appear for padding purposes. */
val regridded: TileLayerRDD[SpatialKey] = layer.regrid(512)
```
You can also regrid to non-rectangular sizes:
```scala
val regridded: TileLayerRDD[SpatialKey] = layer.regrid(tileCols = 100, tileRows = 300)
```
Robust Layer Querying¶
It’s common to find a subset of Tiles in a layer that are touched by some given Polygon:
```scala
val poly: Polygon = ???

val rdd: TileLayerRDD[SpatialKey] =
  layerReader
    .query[SpatialKey, Tile, TileLayerMetadata[SpatialKey]](LayerId("name", zoom))
    .where(Intersects(poly))
    .result
```
Now you can perform this same operation with MultiPolygon, MultiLine, and even
(Polygon, CRS) to ensure that your Layer and Geometry always exist in the same projection.
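For example, a sketch of a CRS-aware query (assuming poly is known to be in LatLng while the layer may be stored in another projection):

```scala
import geotrellis.proj4.LatLng
import geotrellis.vector.Polygon

val poly: Polygon = ???  // coordinates known to be in LatLng

val rdd: TileLayerRDD[SpatialKey] =
  layerReader
    .query[SpatialKey, Tile, TileLayerMetadata[SpatialKey]](LayerId("name", zoom))
    .where(Intersects((poly, LatLng)))  // matched against the layer's CRS
    .result
```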
Tile ASCII Art¶
Sometimes you just want to visualize a Tile without going through the song-and-dance of rendering it to a .png. The existing Tile.asciiDraw method kind of does that, except its output is all numbers. The new Tile.renderAscii: Palette => String method fulfills your heart’s desire:
```scala
import geotrellis.raster._
import geotrellis.raster.io.geotiff._
import geotrellis.raster.render.ascii._

val tile: Tile = SinglebandGeoTiff("path/to/tiff.tiff").tile

// println(tile.renderAscii())  // the default
println(tile.renderAscii(AsciiArtEncoder.Palette.STIPLED))
```
```
▚▖
▚▚▜▚▚
▚▖▚▜▚▖▚▚
▜▚▚▚▜▚█▚▜▚█▚
█▚▜▖▜▖▚▚█▚▚▜▚█▖
▚▚█▚▜▚▚▚▚▚▚▚▜▚▚▚▚▚
▚▚▖▚▚▚▚▚█▜▚▚▜▚▚▖▚▖▚▖▚
▚▚▚▚█▚▚▚▚▚██▚▚▚▜▖▖██▚▚▜▚
▚▚█▚▚▚▚▚▚▚▜▚▚▚▚▚▚▜▚█▚▚▚▚▚▚▚
█▚▚▖▚█▚▜▚▚▚▚▖▚▚▚▚▚▚▚▚▚▚▜▚▚▚▚▚▚▖
█▚▚▚▜▚▖▚▚▚▚▚▚▚▚▚▚▚▚▚▚▚▚▚██▖▜▚█▚▚▚
█▚▚██▚▚▚▚▚▚▚▚▖▚▚▚▚▚▚▚▚█▚▚▚▚▚▚▖▖▖▚▚▚▚
█▜▚▚██▜▚▚▚▜▖▚▚▜▚█▜▚▚▚▜▚▖▚▜▚█▚▚▖▚▚▖▚▚▖▖▚▚
▚▚█▚▚▚█▚██▚▚▚▚▚▚▚▚▜▚▚█▜▚▖█▚▚▚▜▚▚▚▚▚▚▜▚█▚█
█▚▜▚▜▚█▚▜▚▚▜▚█▚▚▚▚▚▚▚▚▚▚▚▖▚▖▚▚▖▚█▚█▚▚▚▖█▚
████▚███▚▚▚▚██▚▚▚█▜▚▚▖▚▚▚▖▖▚▚▚▚▚▚▚▚█▚▜▖█
▖█▜▚█▚██▜▖▜▜█▜▜█▜▚▚▚▚▚█▖▚▚▚▚█▚▚▚▚▚▚▜▚▚█▖▜
▚▖██▚▜▚█▚▚▜▜█▜▜▜██▚▚▚▚█▚▚▚▜▖▚▚█▚▖▚▜▚▚▚▖▚█
█▚▚▚▚▜▚██▖██▜▚▚█▚▚▖▚▚▜▚▖▚▖▚▚▚▚▚▖▚▚▖▖▖▚▖▚
▚▚▚█▚▚▚▚▚█▜▚▚████▚█▚▚▚█▚▖▚▚▚▖▚▚█▚▚▖▚▚▚▖▖▖
▚▚▚█▚▚▚▖▖▚▜█▜██▜██▚▚▖██▜▚▜▚█▚▚▚▚▚▚▚▚▖▖▜██
▚▚▚▚▜█▚▚▚▚▚█████▚▜██▚██▚▚▚▚▜▚▖▚█▚▚▖▚▖▚▚█
▚▚▜▚▚▚▚▜▚▜▚▚▚▚▜▚█▚▜█▚██▚██▚▚▚▚▖▚▚▚▚▖▖▚▚▖█
▚▜▚▜▚▚▚▚▚▚█▚▚▚▚▚██▜▜▜███▖▚▚▜█▚▚▖▚█▚▚█▚▖▚
▚▜▚▚▚▚▚▚▚▚▚▚▜▜▜▚▚▖▚▖▚▚▜▜██▜▚██▚▚▚▚▚▚▖▜█▚
▚▚▖▚▚█▚█▚▚▚█▚▖▚▚▚█▚▚▚▚▚▜██▚█▜▚█▚▜▚▚███▜█▜
▚▚▚▜▚▚▚▚▚▚▚▚▚▚▚▖█▚█▚▚▜█▜█▜█▜▚▖▚▚▚██▜▜█▚▜
▚▚▚▚▜▚▚▚▚▚▚▜▚▚▚▚▚▚▖▚█▜▖▖█▚▖▜▖▚▖█▚▖█▚▚▜▚█
▚▚█▚▚█▚▚▜▚▚▚▚▜▚▚▚▚▚▜▚▖▚█▜█▚▜▜▚█▖▜███▜▚▚
▚▚▚▚▚▚▖▜▚█▚▚▚▖▚▚▚▚▚▚▚▚▚▚▚▜█▖▜▜▜█▚▚▚▖▚█▚█
▜▚▚▚█▚▖▚█▚█▚▚█▚▚▚▚▚▚▚▖▚▚▚▜▚▚▚▜▚▖▚▖▚▚▚▚▜▚
▚▚▚▚▖▚█▖█▜▚▚▚▚▚▚▚▚▖▚▚▖▖█▚▜▚▖▚▚▚▚▖▖▚█▚▚▚
▚▚▚▚▚▚▚▚▚█▚▚▚▖▚▚▚█▚▜▚█▚▚▖▜██▚▖▚▚▚▚▚▚▚▚▚▖
▚▚▚▚▚▚▚▖▚▚██▚▚▚▚▚▚▚▚▜▚▚█▚██▚▚▚▚▖▚▚▖▚▚█▜▖
▚▚▚▚▚▚▚▚▚▚▚▚▚█▚▜▚▚▚▜▚▚▖▚▚▚▚▚▜▚▚▚▚▖▚▚▚▚▚
▚██▖▚▚▚▚▚▚▚▚▜▚▚█▚▚▚▚▜▚▚▚▚█▜▖▚▚█▜▜█▜█▚▖▚▖
▚▚▚▖▚▚█▚▚▜███▚▚▚▜▚▚▚▚▚█▚▖▖█▖▚████▜███▚██
▚█▚▚▚▚██▜▚▜▚▜▜▜█▜▚█▚▜▖▜▚▚▚█▚▜█▚▜▚▚▚▚▚▖▖
█▜█▚▚▜▚▜▚▜▜▜▚▚▚▚██▖▖▖▚██▖█▚▜▜▚▚▚▚▚▚▖
▚█▜▜▜▜▜██▚▜▚▚▚▚▚▚▖▜▚▜▚▚▚▜▚█▚▚▖▖▖
██▚▚▚▚▚▚▚▜▚▜▖▚██▜▜▚▖▚▚█▚▚▚▖▜▜
▜▚▚▖▚▚▚▖▚▜▜██▜▜▚█▚▚▜▚▚▜██▚
▚▚█▚▜▚▚█▖▜▚▚▚▖█▚▚█▚▚█▚
█▜▜▚▚▜▜▚▚▚▜█▚▚▚▜█▜█
▚▚▖▚█▖▚▖▜▚▖▚▖▜▚
███▖██▚▖▚▚▚▚
▜▚▚█▚▚▖▖█
▚▖▜█▜▚
▖█▚
```
Storage on Azure via HDFS¶
By adding some additional configuration, you can now use our HDFS Layer Backend to read and write GeoTrellis layers to Microsoft Azure’s blob storage.
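The wiring is mostly on the Hadoop side rather than in GeoTrellis itself; roughly, you add the hadoop-azure connector to your classpath and supply your storage account key. The account name and key below are placeholders:

```scala
// Placeholder account name/key: configure the wasb:// filesystem on the SparkContext.
sc.hadoopConfiguration.set(
  "fs.azure.account.key.youraccount.blob.core.windows.net",
  "YOUR_ACCESS_KEY")

// Layer catalogs can then live at wasb:// URIs, e.g.:
// "wasb://container@youraccount.blob.core.windows.net/catalog"
```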
Configuring JTS Precision¶
GeoTrellis uses the Java Topology Suite for its vector processing. By default, JTS uses a “floating” PrecisionModel. When writing code that needs to be numerically robust, this default can lead to Topology Exceptions.
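To opt into a fixed PrecisionModel, GeoTrellis reads Typesafe Config settings; something along these lines in your application.conf (key names as documented for this release, worth double-checking against the docs):

```
geotrellis.jts.precision.type  = "fixed"
geotrellis.jts.precision.scale = 1e12
```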
- Negative grid bounds bug
- getSignedByteArray BugFix - fixes certain read problems
- Allow Merge Queue To Handle Larger Inputs
- Generate Windows That Conform To GeoTiff Segments
- Removed inefficient LayerFilter check
- Fixed issue with S3 URI not having a key prefix
- Improve S3 makePath function
- Fix S3GeoTiffRDD behavior with some options.
- Allow Contains(Point) query for temporal rdds
- Haversine formula fix
- Use Scaffeine instead of LRU cache in HadoopValueReader
- Fix GeoTiffInfo serialization issues
- Estimate partitions number based on GeoTiff segments
- Estimate partitions number based on a desired partition size
- Pyramid operation preserves partitioning
- Don’t constrain GridBounds size to IntMax x IntMax
- 4-Connected Line Drawing
- Added requirement for CRS implementations to provide a readable toString representation.
- Allow rasterizer to store Z value at double precision
- Changed scheme path file from /User -> current working dir
- Fix CRS parser and proj4 cea projection support
- Spark Enabled Cost Distance
- Conforming Delaunay Triangulation
- Added a fractional-pixel rasterizer for polygons and multipolygons
- Added collections API mapalgebra local and masking functions
- Added withDefaultNoData method for CellTypes
- Moved Spark TestEnvironment to spark-testkit subproject for usage outside of GeoTrellis
- Add convenience overloads to GeoTiff companion object
- Added matplotlib’s Magma, Inferno, Plasma, and Viridis color ramps
- Added library of land use classification color maps.
- Add MGRS encode/decode support to proj4
- Rasters write support to HDFS / S3
- Added Range-based reading of HTTP resources
- Improved the WKT parser that powers the WKT.getEpsgCode method
- Updated the geotrellis-geowave subproject to GeoWave 0.9.3
- Updated the geotrellis-geomesa subproject to GeoMesa 1.2.7
- Use H3 rather than Next Fit when building S3 partitions from paths
- Added delimiter option to S3InputFormat and S3GeoTiffRDD.
- Signed S3 Streaming for GeoTiff reader (HTTP with GET instead of HEAD request)
- Relaxed constraints to improve layer deletion capabilities
- Allow HadoopGeoTiffRDD and S3GeoTiffRDD to maintain additional key information such as file name
- Added API sugar for simplifying construction of AvroRecordCodec
- Make compression optional for Avro encoding and decoding
- Optimization to avoid unspecialized Function3 usage in Hillshade, Slope and Aspect
- Updated multiple dependencies
- Upgraded ScalaPB version for VectorTile
- Added Avro codecs for ProjectedExtent and TemporalProjectedExtent and ConstantTile types
- Repartition in ETL when re-tiling increases layer resolution
- In GeoTiff reader, compute CellSize from TIFF tags
- Improved apply methods for constructing S3RangeReader
- Reorganized handling of CellType.name
- Documentation improvements, including porting the docs to reStructuredText
- Added top-level “Sinusoidal” CRS, commonly used with MODIS
- Added conditional to key bounds decomposition to detect full bounds query in Accumulo.
- Support for the ability to specify output CRS via proj4 string.
- Fixed issues that made GeoTiff streaming off of S3 slow and broken
- Give a better error message for CRS write failures
- Fix clipping logic during polygon layer query
- Fixed type for CRS authority in NAD83
- Moved JsonFormats for CellSize and CellType to their proper place
- Fixed polygon rasterization for complex polygon test cases
- Fixed issue with FileLayerDeleter
- Fixed issue with logger serialization
- Fixed bug in renderPng that caused incorrect rendering of non-floating-point rasters
- Don’t allow illegal TileLayouts
- Prevent error from happening during Pyramiding
- Ensure tile columns are not zero when rounding
- Fixed malformed XML error that was happening after failed S3 ingest
- Fix issue with S3LayerDeleter deleting files outside of layer
- Fix TemporalProjectedExtentCodec to handle proj4 strings when CRS isn’t available
- Fixed layoutForZoom to allow 0 zoom level
- Fixed MapKeyTransform to deal with points north and west of extent
- Fixed GeoTiff reading for GeoTiffs with model tie point and PixelIsPoint
- Fixed issue with reading tiny (4 pixel or less) GeoTiffs
- Fix usage of IntCachedColorMap in Indexed PNG encoding
- Ensure keyspace exists in CassandraRDDWriter
- Resolved repartitioning issue with HadoopGeoTiffRDD
- Fixed schema for intConstantTileCodec
- In HadoopAttributeStore, get absolute path for attributePath
- In AccumuloLayerDeleter, close batch deleter
- S3InputFormat - bucket names support period and dashes
- Fix TMS scheme min zoom level
- S3AttributeStore now handles ending slashes in prefix.
- Cell type NoData logic for unsigned byte / short not working properly
- CellSize values should not be truncated to integer when parsing from Json.
- Fixes to GeoTiff writing with original LZW compression.
- In ArrayTile.convert, debug instead of warn against floating point data loss.
- Fixes incorrect metadata update in a per-tile reprojection case
- Fix issue with duplicate tiles being read for File and Cassandra backends
- Move to a different Json Schema validator
- S3InputFormat does not filter according to extensions when partitionCount is used
- In S3GeoTiffReader, partitionBytes has no effect if maxTileSize is set
- Fixes typos with rasterizer extension methods
- Fix writing multiband GeoTiff with compression
- Fixed issue with BigTiff vs non-BigTiff offset value packing
While we are trying to stick strictly to SemVer, there are slight API changes in this release. We felt that, while this breaks SemVer in the strictest sense, the changes were not significant enough to warrant a 2.0 release. We hope to be more cognizant of API changes in future releases.
Made EPSG capitalization consistent in method names.
Changed some internal but publicly visible classes dealing with GeoTiff reading
- Replaced foreach on SegmentBytes with getSegments, which the caller can iterate over themselves
- Changed some internal but publicly visible implicit classes and read methods around TiffTagReader
- Added TiffTagOffsetSize as an implicit parameter to multiple locations, most publicly in TiffTagReader.read(byteReader: ByteReader, tagsStartPosition: Long)(implicit ttos: TiffTagOffsetSize). Also changed that method from being generic to always taking a Long offset.
- Moved some misplaced implicit JsonFormats (e.g. CellSizeFormat, previously in geotrellis.spark.etl.config.json) to their proper packages.
Changed LazyLogger from the com.typesafe.scalalogging version to our own version
- This shouldn’t break any code, but technically is an API change.
- GeoTools support
- Streaming GeoTiff reading #1559
- Windowed GeoTiff ingests into GeoTrellis layers, allowing users to ingest large GeoTiffs #1763
- GeoWave Raster/Vector support (experimental)
- GeoMesa Vector support (experimental)
- Create GeoMesa subproject #1621
- Moved to a JSON-configuration ETL process
- Vector Tile reading and writing, file-based and as GeoTrellis layers in RDDs. #1622
- File Backends
- Collections API #1606
- Added Focal calculation target type #1601
- Euclidean distance tiles #1552
- Spark, Scala and Java version support
- Color correction features:
- CollectNeighbors feature, allowing users to group arbitrary values by the neighbor keys according to their SpatialComponent #1860
- Documentation: We moved to ReadTheDocs, and put a lot of work into making our docs significantly better. See them here.
- Documentation improvements
- Added example for translating from
- doc-examples subproject; example for tiling to GeoTiff #1564
- Added example for focal operation on multiband layer. #1577
- Projections, Extents, and Layout Definitions doc #1608
- Added example of turning a list of features into GeoJson #1609
- Introduce ADR concept
- Parallelize reads for S3, File, and Cassandra backends #1607
- Kernel Density in Spark
- k-Nearest Neighbors
- Updated slick
- Added GeoTiff read/write support of TIFFTAG_PHOTOMETRIC via
- Added ability to read/write color tables for GeoTIFFs encoded with palette photometric interpretation #1802
- ColorMap to String conversion #1512
- Add split by cols/rows to SplitMethods #1538
- Improved HDFS support #1556
- Added Vector Join operation for Spark #1610
- Added Histograms Over Fractions of RDDs of Tiles #1692
- Added withNoData methods to Tile #1702
- Changed GeoTiff reader to handle BigTiff #1753
- BreakMap for reclassification based on range values. #1760
- Allow custom save actions on ETL #1764
- Multiband histogram methods #1784
- DelayedConvert feature, allowing users to delay conversions on tiles until a map or combine operation, so that tiles are not iterated over unnecessarily #1797
- Add convenience overloads to GeoTiff companion object #1840
Fixes / Optimizations¶
- Fixed GeoTiff bug in reading NoData value if len = 4 #1490
- Add detail to avro exception message #1505
- Fix: The toSpatial Method gives metadata of type
- Intersects(polygon: Polygon) in layer query #1644
- Make regex for s3 URLs handle s3/s3a/s3n #1652
- Fixed metadata handling on surface calculation for tile layer RDDs #1684
- Fixed reading GeoJson with 3d values #1704
- Fix to Bicubic Interpolation #1708
- Fixed: Band tags with values of length > 31 have additional white space added to them #1756
- Fixed NoData bug in tile merging logic #1793
- Fixed Non-Point Pixel + Partial Cell Rasterizer Bug #1804
- PR #1611 Any Tiles can utilize Polygonal Summary methods. (@fosskers)
- PR #1573 New MultibandTile which maps over each band at once. (@hjaekel)
- PR #1600 New mapBands method to map more cleanly over the bands of a MultibandTile.
- PR #1561 Fix to polygon sequence union, accounting for the fact that it can result in NoResult. (1)
- PR #1585 Removed warnings; add proper subtyping to GetComponent and SetComponent identity implicits; fix jai travis breakage. (1)
- PR #1569 Moved RDDLayoutMergeMethods functionality to object. (1)
- PR #1494 Add ETL option to specify upper zoom limit for raster layer ingestion (@mbertrand)
- PR #1571 Fix scallop upgrade issue in spark-etl (@pomadchin)
- PR #1543 Fix to Hadoop LayerMover (@pomadchin)
Special thanks to new contributor @mbertrand!
- PR #1451 Optimize reading from compressed Bit geotiffs (@shiraeeshi)
- PR #1454 Fix issues with IDW interpolation (@lokifacio)
- PR #1457 Store FastMapHistogram counts as longs (@jpolchlo)
- PR #1460 Fixes to user defined float/double CellType parsing (@echeipesh)
- PR #1461 Pass resampling method argument to merge in CutTiles (1)
- PR #1466 Handle Special Characters in proj4j (@jamesmcclain)
- PR #1468 Fix nodata values in codecs (@shiraeeshi)
- PR #1472 Fix typo in MultibandIngest.scala (@timothymschier)
- PR #1478 Fix month and year calculations (@shiraeeshi)
- PR #1483 Fix Rasterizer Bug (@jamesmcclain)
- PR #1485 Upgrade dependencies as part of our LocationTech CQ process (1)
- PR #1487 Handle entire layers of NODATA (@fosskers)
- PR #1493 Added support for int32raw cell types in CellType.fromString (@jpolchlo)
- PR #1496 Update slick (@adamkozuch, @moradology)
- PR #1498 Add ability to specify number of streaming buckets (@moradology)
- PR #1500 Add logic to ensure use of minval/avoid repetition of breaks (@moradology)
- PR #1501 SparkContext temporal GeoTiff format args (@echeipesh)
- PR #1510 Remove dep on cellType when specifying layoutExtent (@fosskers)
- PR #1529 LayerUpdater fix (@pomadchin)
Special thanks to new contributors @fosskers, @adamkozuch, @jpolchlo, @shiraeeshi, @lokifacio!
The long awaited GeoTrellis 0.10 release is here!
It’s been a while since the 0.9 release of GeoTrellis, and there are many significant changes and improvements in this release. GeoTrellis has become an expansive suite of modular components that aid users in building geospatial applications in Scala, and as always we’ve focused specifically on high performance and distributed computing. This is the first official release that supports working with Apache Spark, and we are very pleased with the results that have come out of the decision to support Spark as our main distributed processing engine. Those of you who have been tuned in for a while know we started with a custom-built processing engine based on Akka actors; this original execution engine still exists in 0.10 but is in a deprecated state in the geotrellis-engine subproject. Along with upgrading GeoTrellis to support Spark and handle arbitrarily-sized raster data sets, we’ve been making improvements and additions to core functionality, including adding vector and projection support.
It’s been long enough that release notes, stating what has changed since 0.9, would be quite unwieldy. Instead I put together a list of features that GeoTrellis 0.10 supports. This is included in the README on the GeoTrellis Github, but I will put them here as well. It is organized by subproject, with more basic and core subprojects higher in the list, and the subprojects that rely on that core functionality later in the list, along with a high level description of each subproject.
- Represent a Coordinate Reference System (CRS) based on Ellipsoid, Datum, and Projection.
- Translate CRSs to and from proj4 string representations.
- Lookup CRS’s based on EPSG and other codes.
- Transform (x, y) coordinates from one CRS to another.
- Provides a scala idiomatic wrapper around JTS types: Point, Line (LineString in JTS), Polygon, MultiPoint, MultiLine (MultiLineString in JTS), MultiPolygon, GeometryCollection
- Methods for geometric operations supported in JTS, with results that provide a type-safe way to match over possible results of geometries.
- Provides a Feature type that is the composition of a geometry and a generic data type.
- Read and write geometries and features to and from GeoJSON.
- Read and write geometries to and from WKT and WKB.
- Reproject geometries between two CRSs.
- Geometric operations: Convex Hull, Densification, Simplification
- Perform Kriging interpolation on point values.
- Perform affine transformations of geometries
- GeometryBuilder for building test geometries
- GeometryMatcher for scalatest unit tests, which aids in testing equality in geometries with an optional threshold.
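As a small illustration of the GeoJSON support listed above (a sketch; assumes the geotrellis-vector artifact is on the classpath):

```scala
import geotrellis.vector._
import geotrellis.vector.io._

val p: Point = Point(-75.16, 39.95)

// Geometry => GeoJSON string, and back again.
val json: String = p.toGeoJson
val p2: Point = json.parseGeoJson[Point]
```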
- Provides types to represent single- and multi-band rasters, supporting Bit, Byte, UByte, Short, UShort, Int, Float, and Double data, with either a constant NoData value (which improves performance) or a user defined NoData value.
- Treat a tile as a collection of values, by calling “map” and “foreach”, along with floating point valued versions of those methods (separated out for performance).
- Combine raster data in generic ways.
- Render rasters via color ramps and color maps to PNG and JPG images.
- Read GeoTiffs with DEFLATE, LZW, and PackBits compression, including horizontal and floating point prediction for LZW and DEFLATE.
- Write GeoTiffs with DEFLATE or no compression.
- Reproject rasters from one CRS to another.
- Resample raster data.
- Mask and Crop rasters.
- Split rasters into smaller tiles, and stitch tiles into larger rasters.
- Derive histograms from rasters in order to represent the distribution of values and create quantile breaks.
- Local Map Algebra operations: Abs, Acos, Add, And, Asin, Atan, Atan2, Ceil, Cos, Cosh, Defined, Divide, Equal, Floor, Greater, GreaterOrEqual, InverseMask, Less, LessOrEqual, Log, Majority, Mask, Max, MaxN, Mean, Min, MinN, Minority, Multiply, Negate, Not, Or, Pow, Round, Sin, Sinh, Sqrt, Subtract, Tan, Tanh, Undefined, Unequal, Variance, Variety, Xor, If
- Focal Map Algebra operations: Hillshade, Aspect, Slope, Convolve, Conway’s Game of Life, Max, Mean, Median, Mode, Min, MoransI, StandardDeviation, Sum
- Zonal Map Algebra operations: ZonalHistogram, ZonalPercentage
- Operations that summarize raster data intersecting polygons: Min, Mean, Max, Sum.
- Cost distance operation based on a set of starting points and a friction raster.
- Hydrology operations: Accumulation, Fill, and FlowDirection.
- Rasterization of geometries and the ability to iterate over cell values covered by geometries.
- Vectorization of raster data.
- Kriging Interpolation of point data into rasters.
- Viewshed operation.
- RegionGroup operation.
- Build test raster data.
- Assert raster data matches Array data or other rasters in scalatest.
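For instance, a minimal sketch of a local map algebra operation (cell-wise Add) on two small tiles:

```scala
import geotrellis.raster._
import geotrellis.raster.mapalgebra.local._  // local-op implicits, if not already in scope

val a: Tile = IntArrayTile(Array(1, 2, 3, 4), 2, 2)
val b: Tile = IntArrayTile(Array(10, 20, 30, 40), 2, 2)

// Local Add: the sum of the two tiles, cell by cell.
val sum: Tile = a.localAdd(b)  // cell (0, 0) holds 1 + 10 = 11
```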
- Generic way to represent key value RDDs as layers, where the key represents a coordinate in space based on some uniform grid layout, optionally with a temporal component.
- Represent spatial or spatiotemporal raster data as an RDD of raster tiles.
- Generic architecture for saving/loading layers of RDD data and metadata to/from various backends, using Spark’s IO API with Space Filling Curve indexing to optimize storage retrieval (support for Hilbert curve and Z order curve SFCs). HDFS and the local file system are supported backends by default; S3 and Accumulo are supported by their respective subprojects.
- Query architecture that allows for simple querying of layer data by spatial or spatiotemporal bounds.
- Perform map algebra operations on layers of raster data, including all supported Map Algebra operations mentioned in the geotrellis-raster feature list.
- Perform seamless reprojection on raster layers, using neighboring tile information in the reprojection to avoid unwanted NoData cells.
- Pyramid up layers through zoom levels using various resampling methods.
- Types to reason about tiled raster layouts in various CRS’s and schemes.
- Perform operations on raster RDD layers: crop, filter, join, mask, merge, partition, pyramid, render, resample, split, stitch, and tile.
- Polygonal summary over raster layers: Min, Mean, Max, Sum.
- Save spatially keyed RDDs of byte arrays to z/x/y files into HDFS or the local file system. Useful for saving PNGs off for use as map layers in web maps or for accessing GeoTiffs through z/x/y tile coordinates.
- Utilities around creating spark contexts for applications using GeoTrellis, including a Kryo registrator that registers most types.
- Utility code to create test RDDs of raster data.
- Matching methods to test equality of RDDs of raster data in scalatest unit tests.
- Save and load layers to and from Accumulo. Query large layers efficiently using the layer query API.
- Save and load layers to and from Cassandra. Query large layers efficiently using the layer query API.
- Save/load raster layers to/from the local filesystem or HDFS using Spark’s IO API.
- Save spatially keyed RDDs of byte arrays to z/x/y files in S3. Useful for saving PNGs off for use as map layers in web maps.
- Parse command line options for input and output of ETL (Extract, Transform, and Load) applications
- Utility methods that make ETL applications easier for the user to build.
- Work with input rasters from the local file system, HDFS, or S3
- Reproject input rasters using a per-tile reproject or a seamless reprojection that takes into account neighboring tiles.
- Transform input rasters into layers based on a ZXY layout scheme
- Save layers into Accumulo, S3, HDFS or the local file system.
- Read geometry and feature data from shapefiles into GeoTrellis types using GeoTools.
- Save and load geometry and feature data to and from PostGIS using the Slick Scala database library.
- Perform PostGIS ST_ operations through Scala.