In the latest Bayfield Training Webinar, Sonia Martin-Gutierrez, in collaboration with Dr. Seppe Cassetari, presented an introduction to geospatial integration. The webinar showed how to make small adjustments to existing spreadsheets or financial databases by simply adding geospatial keys that link existing data to a map, and it explored some of the statistical and analytical tools you can apply to your data to better understand its geographical context. It also highlighted how geospatial statistics can further enhance your understanding of a problem, and some of the pitfalls that arise from inconsistent data of varying resolutions and age.
Geospatial Analysis—the ‘where’ question
Dr. Cassetari explains that the first natural question in geospatial analysis is “where” the object you are analysing is located. The simplest form of analysis uses the attribute table associated with the geospatial data: a simple analysis of attributes applies common spreadsheet functions (count, sum, average, etc.). Finally, Dr. Cassetari emphasizes that the quality and size of the attribute database are essential.
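As a rough illustration of this kind of attribute-table summary, the sketch below applies count, sum, and mean to a small, made-up attribute table using pandas; the column names and figures are purely hypothetical and are not taken from the webinar.

```python
import pandas as pd

# Hypothetical attribute table: one row per property record
attributes = pd.DataFrame({
    "region":    ["North", "North", "South", "South", "South"],
    "value_gbp": [250_000, 310_000, 190_000, 220_000, 275_000],
})

# Common spreadsheet-style summaries applied to the attribute table
print("count:", attributes["value_gbp"].count())
print("sum:  ", attributes["value_gbp"].sum())
print("mean: ", attributes["value_gbp"].mean())

# Grouping by an attribute adds a simple geographical dimension to the summary
print(attributes.groupby("region")["value_gbp"].agg(["count", "sum", "mean"]))
```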
Data Analysis Techniques—2D
In the next slide, Dr. Cassetari explains how to analyse data within a geospatial environment. Several forms of analysis include points by area and the relationship between two areas. To analyse points by area, one can use ‘count within’ and ‘count outside’ techniques; more advanced techniques include ‘clusters’ and ‘buffers’ (zones of uncertainty). To assess the relationship between two areas, one can apply Boolean logic. Boolean logic is especially useful for computing (or modelling) new attributes in topological overlay processing for both vector and raster-based systems. It is built around three simple operators, the Boolean operators “Or,” “And,” and “Not,” and at its heart is the idea that every value is either true or false. The results are generally affected by data quality and by combining boundary sets of different resolutions.
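A minimal sketch of these ideas, using shapely with invented coordinates (the geometries and the 1-unit buffer distance are assumptions, not taken from the webinar): counting points inside and outside an area, buffering the area to model a zone of uncertainty, and combining two areas with the Boolean operators And, Or, and Not (intersection, union, and difference).

```python
from shapely.geometry import Point, Polygon

# Hypothetical study area and point observations (coordinates are illustrative)
area = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
points = [Point(2, 3), Point(5, 5), Point(12, 4), Point(9.8, 9.9)]

# 'Count within' / 'count outside' a boundary
inside = [p for p in points if area.contains(p)]
outside = [p for p in points if not area.contains(p)]
print(len(inside), "inside,", len(outside), "outside")

# A buffer models a zone of uncertainty around the boundary
fuzzy_area = area.buffer(1.0)  # 1-unit tolerance around the polygon
print(sum(fuzzy_area.contains(p) for p in points), "inside the buffered area")

# Boolean (topological) overlay of two areas: And, Or, Not
other = Polygon([(5, 5), (15, 5), (15, 15), (5, 15)])
both = area.intersection(other)       # And
either = area.union(other)            # Or
only_area = area.difference(other)    # Not
print(both.area, either.area, only_area.area)
```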
Further data analysis techniques
Next, Dr. Cassetari expands on further data analysis techniques. Network analysis, for example, includes techniques such as distance between points, quickest route, and flow volumes. Networks are the precursor to 3-D analysis, and some networks model flow, constrained by factors such as channel capacity.
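A minimal sketch of these network ideas, using networkx on a small hypothetical road graph; the node names and the `time` and `capacity` edge attributes are illustrative assumptions rather than anything presented in the webinar.

```python
import networkx as nx

# Hypothetical road network: edges carry travel time (minutes) and capacity (vehicles/hour)
G = nx.DiGraph()
G.add_edge("A", "B", time=5, capacity=800)
G.add_edge("B", "C", time=7, capacity=600)
G.add_edge("A", "C", time=15, capacity=400)
G.add_edge("C", "D", time=4, capacity=900)

# Quickest route between two nodes, weighted by travel time
route = nx.shortest_path(G, "A", "D", weight="time")
minutes = nx.shortest_path_length(G, "A", "D", weight="time")
print(route, minutes)

# Flow volume the network can carry from A to D, limited by channel capacity
flow_value, flow_by_edge = nx.maximum_flow(G, "A", "D", capacity="capacity")
print(flow_value)
```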
Moreover, several techniques use simple surface analysis, including cut-and-fill operations and volumetric studies. A surface is a vector or raster dataset that contains an attribute value for every location throughout its extent. In a sense, all raster datasets are surfaces, but not all vector datasets are. Surfaces are commonly used in a geographic information system (GIS) to visualize phenomena such as elevation, temperature, slope, aspect and rainfall. The ability to create a surface is a valuable tool in GIS.
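To make the cut-and-fill idea concrete, the sketch below compares a hypothetical existing elevation surface with a flat design level on a regular grid and sums the volumes of material to be removed (cut) and added (fill); the grid values, design level, and cell size are all invented for illustration.

```python
import numpy as np

# Hypothetical elevation surface (metres) on a regular grid, cell size 10 m x 10 m
existing = np.array([[10.0, 10.5, 11.0],
                     [10.2, 10.8, 11.3],
                     [10.4, 11.0, 11.6]])
proposed = np.full_like(existing, 10.8)  # flat target design level
cell_area = 10 * 10                      # m^2 per cell

diff = proposed - existing
fill_volume = diff[diff > 0].sum() * cell_area    # material to add (m^3)
cut_volume = -diff[diff < 0].sum() * cell_area    # material to remove (m^3)
print(f"cut: {cut_volume:.1f} m^3, fill: {fill_volume:.1f} m^3")
```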
Spatial analysis with CAD and BIM models
To conclude, Dr. Cassetari explains spatial analysis with different models. Beginning with raster data, he warns that the biggest mistake is to combine datasets with different resolutions: doing so “degrades the highest level of detail to the poorest level of detail.” There are several tools within the GIS environment, such as processing tools that allow you to extract data and use it in other GIS analyses. Next, an essential step in classification is deciding and understanding how the data is calibrated. For example, if you were trying to use remote sensing tools to identify all the tarmac in an aerial photograph, you would need to consider all the different values that tarmac can take in the image. Ultimately, Dr. Cassetari explains that GIS tools can be used in many ways, but whatever the type of analysis, it is crucial to have some idea of the expected result.
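The sketch below illustrates both warnings in simplified form: a threshold classification of a made-up single-band image, where the tarmac value range is an assumption that would in practice be calibrated from known samples, and the aggregation of a finer raster to a coarser cell size, which shows how detail is degraded towards the poorest resolution. None of the values come from the webinar.

```python
import numpy as np

# Hypothetical single-band aerial image values (0-255)
image = np.array([[ 60,  62, 200],
                  [ 58, 190, 210],
                  [ 55,  61,  59]])

# Calibration: the value range that tarmac takes in this imagery is an assumed
# figure; in practice it is established from known (ground-truthed) samples
TARMAC_MIN, TARMAC_MAX = 50, 70

tarmac_mask = (image >= TARMAC_MIN) & (image <= TARMAC_MAX)
print("tarmac pixels:", tarmac_mask.sum())

# Combining rasters of different resolutions: the finer grid must be aggregated
# to the coarser cell size, degrading detail to the poorest resolution
fine = np.arange(16, dtype=float).reshape(4, 4)        # e.g. 5 m cells
coarse = fine.reshape(2, 2, 2, 2).mean(axis=(1, 3))    # aggregated to 10 m cells
print(coarse)
```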