Tuesday, November 10, 2015

Supervised Classification



This map shows a supervised classification identifying land use and land cover in Germantown, MD for the purpose of tracking changes in population and land consumption.  An image of the area was used to create spectral signatures; based on those signatures, the image was classified and the signatures merged into 8 classes.  The area of each class was calculated so that, when compared with images from earlier or later years, the changes can be quantified.

Creating signatures requires paying attention to what type of process you use for each signature.  Sometimes it is better to draw a polygon; other times growing a seed is the preferred method.  In general, the seed-growing method seemed best, although for the Fallow Field 1 signature the polygon was a good choice.  By looking at histograms of each signature, it was possible to see whether there was a large enough sample of pixels, and whether there was separation in the spectral signature between classes.  Using the Mean Plot, it was possible to determine the best band combination to create separation between the spectral signatures.

With more time, even better signatures could have been created, but the distance image shows there were not too many poorly classified areas.

Tuesday, November 3, 2015

Unsupervised Classification



This is an example of unsupervised classification. Here a Landsat image was processed so that clusters of similar pixels were grouped as classes, and then those classes were matched to various feature types in the original image.  The result is a thematic map representing, in this case, 5 classes.  The classes can be analyzed to answer questions; in this case the question was what percentage of the image represented impermeable versus permeable surface.  The necessary information was contained in the Attribute table for the newly created classification image.  One issue that came up was that some pixels were classified in more than one group - for example there were green grass pixels on the roofs of some buildings.  The solution was to create a "Mixed" class that would allow for those situations where classification into a clearly identified group was not possible.

Tuesday, October 27, 2015

Thermal & Multispectral Analysis



The three images above all show different ways in which a remotely sensed image can be viewed in order to examine different features of interest.  In this image showing part of the coast of Ecuador, a feature that appears to be fields is visible to the lower right of the island.  Each image shows the same feature, but what each band combination highlights is different.  In the true color image, the colors are what would be "expected" - vegetation is green, bare earth is grey, urban areas are tan.  The smaller rivers and streams show variation in depth based on the lightness or darkness of the brown and grey hues.  When the band combination is changed to the configuration in the upper right image, much of the vegetation stands out as bright green, but several areas, including the focus feature, show up as blue.  This highlights the moisture content of various elements in the image, including all rivers and streams, as well as areas of agriculture that are moist.  The third image shows Layer 6, the thermal band, displayed with a single color ramp.  The feature stands out as both an area of vegetation and an area of relatively high moisture content because of the radiant energy emitted by water compared to other substances.

Tuesday, October 20, 2015

Multispectral Analysis


This week's lab focused on learning how to read and understand histograms and how to identify the spectral characteristics of various features.  We also explored various band combinations that would highlight certain features within an image.  Below are three examples.

The first feature we were to identify had a spike in Layer 4 between pixel values 12 and 18.  These low numbers indicate that the feature absorbs rather than reflects EMR, and this image clearly shows that characteristic, especially when the colors are manipulated so that the water is especially dark, contrasted with the light green vegetation and pink/red ground.

The second feature we were looking for had to satisfy two different criteria - a spike in pixel values around 200 in Layers 1-4, indicating high reflectivity, and a spike in pixel values between 9 and 11 in Layers 5 & 6.  The snow-capped mountains have this pattern.  A bright, nearly electric blue makes this feature stand out.

Variations in this water feature are highlighted by using a band pattern that allows the vegetation to remain muted.  Attention is then focused on the variations in the water, showing sediment buildup and a clear channel leading out to a larger body of water.  

Tuesday, October 13, 2015

Image Enhancement


This map shows an image that was enhanced using both Imagine and ArcGIS.  The first problem was to reduce the visibility of striping in the Landsat 7 image.  A Fourier Transform reduces the banding, although it can still be seen.  Further use of filters and adjustment of the histogram improved the contrast and detail, allowing the viewer to see edges more clearly and to identify areas of vegetation, water, and urban or residential land use.

Monday, September 28, 2015

Intro to ERDAS Imagine and Digital Data


This is a subset of an image we processed using ERDAS Imagine.  First we had to learn a bit about how to use this new program, which seems to have a huge range of capabilities.  After loading a raster image into the Viewer and making sure the display was set to Pseudo Color, we spent some time learning how to navigate around the image and comparing one image at a lower level of detail (AVHRR) with a Landsat Thematic Mapper satellite image.  The first was classified, while the second was continuous - raw data from the sensor.  We set the preferences to enable Fit to Frame and Background Transparent, making sure the Clear Display option was off, and experimented with options in the Multispectral tab to see which combinations of bands enhanced which types of features, such as vegetation, rivers, urban areas, etc.  The third part of this lab was the creation of a map, which we began in ERDAS Imagine, where we created an attribute to show the area of each class in the image, and then exported a subset of the image to make a map in ArcMap.  This was done using the Inquire Box and then creating a subset image.  In ArcMap we made Class_Name the value field, with Unique Values chosen so that the hectares for each class would be listed.  I chose to edit the description so that the information would remain with the data rather than being temporary.

Clearly the ERDAS Imagine program is very useful and has huge capabilities, but it is hampered by a few bugs, which is unfortunate.

Tuesday, September 22, 2015

Truthing for Accuracy

The red and green dots represent 30 sample points that were randomly selected in order to check the accuracy of how they were classified in last week's lab.  To reduce bias, I created and placed them in roughly equal proportions according to each classification type.  Then, using Street View in Google Maps, I examined each site and recorded whether it was accurate or not, and if it was not, what classification it should have been.  Finally, I calculated the percent accuracy of the sample points.  Because we generalized in making the polygons and classifying them originally, the accuracy was not very high.  Sometimes the point landed on a structure that was commercial rather than residential, although overall the polygon seemed to be the correct type.  In one case, what had been a sandy area when the photograph was taken had been built up with several houses.  What I had thought might have been an academic complex turned out to be several businesses and the Jackson State Fairgrounds.  Several areas that seemed forested were actually residential.  A closer reading of the classification descriptions also prompted me to change one or two classifications.

Tuesday, September 15, 2015

Level II LULC Classification

 
 
Classification of land use and land cover was the focus of this week's lab.  An aerial photograph of an area of Pascagoula, MS was digitized to create a land use/land cover map.  Features were identified using tone, pattern, shadows, size and shape, and associations, the subject of last week's lab, and then polygons were drawn around those features and classified.  The USGS Standard Land Use/Land Cover Classification System we used has several levels - we worked at Level II.  To begin, Level I classifications were identified, such as Urban or Built-up Land, Forest Land, and Water.  Within each of those, Level II categories were located, such as Residential areas, Lakes, and Deciduous Forest Land.  To create the map, the polygons were labeled, a logical color scheme was chosen to highlight the various categories, and the other essential map elements were added.
 
From this map, it is clear that Level II classification is quite general, and that by generalizing, many specific features and land types are incorrectly identified.  Level III classification provides far better identification, but is obviously more time consuming.

Monday, September 7, 2015

Visual Interpretation of Aerial Photographs

This week we produced 2 maps showing some of the ways that aerial images can be interpreted.  The first map shows variations in tone and texture.  There are 5 variations for each scale, with each polygon enclosing one variation.

 
 
The second map uses 4 criteria to identify features: Shape and Size, Shadows, Patterns, and Association. 

Monday, August 3, 2015

Sharing Tools


The final week's lab required that we open and edit a script, updating it so that the hard-coded variables for the input boundary feature and output file location parameters were set using sys.argv[ ] instead.  This approach numbers parameters beginning with 1 for the first parameter, rather than 0, which is the index the GetParameter and GetParameterAsText functions start from.
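As a rough sketch of the difference, with hypothetical parameter names (our actual script's variables may differ):

```python
import sys
import arcpy

# Using sys.argv: index 0 is the script path itself, so parameters start at 1
boundary_fc = sys.argv[1]
output_folder = sys.argv[2]

# The equivalent using arcpy functions, which start counting at 0
# boundary_fc = arcpy.GetParameterAsText(0)
# output_folder = arcpy.GetParameterAsText(1)
```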

The next step was to edit the tool's description in ArcCatalog to make the tool more user-friendly and informative.  We added Dialog Explanations for each parameter, which show up in the dialog box when using the tool, as seen in the screenshot here on the left.  On the right is the map created by the tool, which creates random points and then puts a buffer around each one.

The last step was to embed the script into the tool so it could be shared without adding a separate script file, and so that it could be protected with a password, which we did. 

Two other important steps are making sure that relative file paths are stored (via the check box), and that the .py extension is made visible in ArcCatalog via the Options box.

Sunday, July 26, 2015

Module 10: Custom tools

 
The ability to turn a script into a custom tool was this week's topic.  To do that, the following basic steps were followed:

1. Create and save a .py script
2. Create a custom toolbox for storing it
3. Add a script tool to the custom toolbox with the Add Script wizard, selecting the saved .py script
4. Modify the code in the script so it can receive the parameters set by the tool dialog box
5. Set up the parameters in the tool's properties
6. Edit the script to use the GetParameterAsText and GetParameter functions so the tool's parameter dialog box can pass values into it

 
Here is a screenshot of the parameter Dialog box that allows input to the script tool.  Here there are 4 parameters, two of them already showing default locations for input and output files.  These are examples of workspace type parameters.  Two feature class parameter types are also listed: clip boundary feature and input features.  The latter was set to have a MultiValue property, since more than one .shp was used in this situation. 
 
 
 
Before running the script tool, we edited the script once more, adding arcpy.AddMessage() statements so that progress messages would print in the Results window, as shown below.
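Putting the pieces together, the pattern looked roughly like this sketch; the parameter order, the field and file names, and the clip step are illustrative, not the lab's exact code:

```python
import os
import arcpy

# Read values from the script tool's dialog box (parameter indices start at 0)
workspace      = arcpy.GetParameterAsText(0)   # workspace parameter
output_folder  = arcpy.GetParameterAsText(1)   # workspace parameter
clip_boundary  = arcpy.GetParameterAsText(2)   # feature class parameter
input_features = arcpy.GetParameterAsText(3)   # MultiValue feature class parameter

arcpy.env.workspace = workspace
arcpy.env.overwriteOutput = True

# A MultiValue parameter arrives as a single semicolon-delimited string
for fc in input_features.split(";"):
    out_fc = os.path.join(output_folder, arcpy.Describe(fc).baseName + "_clip.shp")
    arcpy.Clip_analysis(fc, clip_boundary, out_fc)
    # AddMessage text appears in the tool's progress dialog and Results window
    arcpy.AddMessage("Clipped {0} to {1}".format(fc, out_fc))
```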

 
This lab was especially useful in showing how all the work of creating scripts can be used in a simpler, more direct, and integrated way in ArcMap.

Wednesday, July 22, 2015

Module 9: Rasters

Last week we worked with vectors; this week we learned how to write scripts for raster data.  The functions and classes found in the arcpy.sa module are used to list, describe, create, and modify rasters.

ListRasters is used to find out what rasters exist in a workspace - Esri GRID and geodatabase rasters do not have file extensions; .img, .tif, and .jpg extensions are returned for other image formats.

Describe returns general and specific properties of raster datasets.  The elements that can be described are datasets, bands, and catalogs.  Different properties are available depending on which element is being described.

Once a raster object is created, it can be used in other Python statements and map algebra statements.  Properties include band count, cell height and width, spatial reference, pixel type, and more.  The Raster object has only one method, save, which makes a temporary raster permanent so that it persists after ArcMap is closed.

To use the tools in the arcpy.sa module directly, all of its functions can be imported.  This is especially helpful when using map algebra operators.

Several classes in the arcpy.sa module were introduced, including Remap and Neighborhood.  There are 8 others.
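A short sketch pulling these pieces together; the workspace path and raster names are placeholders, not the lab's data:

```python
import arcpy
from arcpy.sa import *   # bring the Spatial Analyst functions in directly

arcpy.env.workspace = r"C:\data\rasters"   # placeholder workspace
arcpy.CheckOutExtension("Spatial")

# List the rasters in the workspace (GRID and geodatabase rasters have no extension)
for raster_name in arcpy.ListRasters():
    desc = arcpy.Describe(raster_name)
    print("{0}: {1} band(s), {2}".format(
        raster_name, desc.bandCount, desc.spatialReference.name))

# A Raster object exposes properties and can be used in map algebra
elevation = Raster("elevation")            # placeholder raster name
high_ground = elevation > 1000             # temporary raster from map algebra
high_ground.save(r"C:\data\rasters\high_ground")   # save() makes it permanent
```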

This week's lab asked us to create a raster from 2 existing rasters: elevation and landcover.  To start, we wrote code to see if the Spatial Analyst extension was available, then checked it out.  Next we reclassified the landcover raster so that 3 forest landcover classifications were changed to a single value of 1.  Then we created temporary rasters for areas with slope between 5 and 20 degrees and aspect between 150 and 270 degrees.  The combined temporary rasters were saved to a final raster in a newly created geodatabase, and the Spatial Analyst extension was checked back in.  A sketch of that sequence appears below, followed by the result as seen in ArcMap.
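The landcover class values, slope and aspect thresholds, paths, and geodatabase name here are stand-ins rather than the lab's exact values:

```python
import arcpy
from arcpy.sa import *

arcpy.env.workspace = r"C:\data\module9"   # placeholder workspace

# Only proceed if the Spatial Analyst extension is available
if arcpy.CheckExtension("Spatial") == "Available":
    arcpy.CheckOutExtension("Spatial")

    # Reclassify three forest landcover classes (e.g. 41, 42, 43) to a single value of 1
    forest = Reclassify("landcover", "Value",
                        RemapValue([[41, 1], [42, 1], [43, 1]]), "NODATA")

    # Temporary rasters for the elevation criteria
    slope = Slope("elevation", "DEGREE")
    aspect = Aspect("elevation")
    good_slope = (slope > 5) & (slope < 20)
    good_aspect = (aspect > 150) & (aspect < 270)

    # Combine the criteria and save the result into a new file geodatabase
    final = forest & good_slope & good_aspect
    arcpy.CreateFileGDB_management(r"C:\data\module9", "results.gdb")
    final.save(r"C:\data\module9\results.gdb\suitable_areas")

    arcpy.CheckInExtension("Spatial")
else:
    print("Spatial Analyst extension is not available.")
```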

Monday, July 20, 2015

Using GIS for landscape archaeology


Sam Turner and Jim Crow, “Unlocking Historic Landscapes: Two Pilot Studies Using Historic Landscape Characterisation,” Antiquity 84 (2010): 216–229


This article explains how GIS is being used to recreate historic landscapes and to analyze and interpret their development.  Historic Landscape Characterisation (HLC) is the term for “mapping landscape with particular reference to its historic character and development”.  The archaeologists in this example are using GIS to explore two locations in the Aegean, focusing on field systems and how they evolved over time.

Recently there has been a shift away from the two prevailing approaches to the study of past, present and future landscapes - the economic/functional and the social/symbolic - toward a focus on “multifunctionality”.  HLC uses satellite images, aerial photography, and historic maps to “map, analyze, compare and contrast the perceptions of a wide range of people working within the landscape” using GIS.

HLC is not new, but in the past data storage had been an issue when working with such large projects.  GIS provides a solution, allowing a range of map sources, the ability to adjust scale, and the analysis tools which enhance interpretation.   With GIS, features of the historical landscape at a particular period are bundled together, creating visible groupings and patterns that characterize that period.  It is possible to do retrogressive analysis, and to add explanatory text connected to a database.  Also, GIS is flexible and adaptable to projects and research questions of any size or topic.

The two examples of the use of GIS in HLC in this article are from Greece and Turkey.  In Greece, the fields are terraced, and modeling can expose patterns of land use over time.  In Turkey, coaxial fields were used, and again, maps show a chronological progression of land development.

I chose this article for this assignment because it is along the lines of a project I’d like to put together once I’ve learned enough to do it. 

 

 

 

Thursday, July 16, 2015

Working with Geometries

 
In Module 8, we explored how to work with geometries - points, lines and polygons.  The use of geometry tokens to simplify some operations was one topic; SHAPE@LENGTH and SHAPE@XY were two of the tokens we worked with. We also learned how to parse points and polylines in order to get at the values within the geometry. The last part of the lesson focused on reading and writing geometries.
 
Below are the results of a script we worked on this week.  Each line contains data about a point or vertex in a feature.  The shapefile that provided the data was of rivers in Hawaii.  The script retrieved information including the object ID, x and y coordinates, and name of the river for each point, then created a new text file and wrote the information to that text file.
 
 
 
 
I had some trouble with the last part of the lab, writing to the text file.  My two issues were that I didn't put the writing part of the script inside the most nested loop - I had it all the way out, not indented.  Also, I was using double quotes around my variables, so I kept getting the variable name to print rather than its value.
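For reference, here is a condensed sketch of that kind of script, with the write() call where it belongs; the shapefile path and the NAME field are assumptions about the data, not the lab's exact names:

```python
import arcpy

rivers = r"C:\data\rivers_hawaii.shp"     # placeholder shapefile path
output = r"C:\data\rivers_vertices.txt"   # placeholder output text file

# OID@ and SHAPE@ are geometry tokens; NAME is assumed to hold the river name
with arcpy.da.SearchCursor(rivers, ["OID@", "SHAPE@", "NAME"]) as cursor:
    with open(output, "w") as txt:
        for row in cursor:
            vertex_id = 0
            # A polyline geometry is made of parts; each part is an array of points
            for part in row[1]:
                for point in part:
                    vertex_id += 1
                    # The write() belongs in the innermost loop, and the
                    # variables are not wrapped in quotes
                    txt.write("{0} {1} {2} {3} {4}\n".format(
                        row[0], vertex_id, point.X, point.Y, row[2]))
```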

Wednesday, July 8, 2015

Cursors and Dictionaries


This week we investigated ways to explore and manipulate spatial data.  The Describe function allows the user to discover what data exists as well as to examine the properties of that data.  Tuples and dictionaries were introduced.  We then went on to learn how to manipulate data by using cursors to access and iterate over rows in a table, or to insert or delete records.  One use of the SearchCursor function we practiced in the lab was in creating a SQL statement, which in our case gave us the name and population of all county seats in New Mexico.  We used this table to create a dictionary of keys (cities that were county seats) and values (their population).  One other topic this week was working with text, which was not part of the lab.

This screenshot shows the results of a script which created a geodatabase, copied data (shapefiles) into that geodatabase, selected the name, feature and population fields of the cities feature class, then narrowed that down to just those cities which were county seats.  The last part of the script created a dictionary and populated it with this information and printed it.
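In outline, the county-seat portion of the script looked something like the sketch below; the geodatabase path and the field names FEATURE and POP_2000 are assumptions rather than the lab's actual names:

```python
import arcpy

arcpy.env.workspace = r"C:\data\module7\newmexico.gdb"   # placeholder geodatabase

fields = ["NAME", "FEATURE", "POP_2000"]        # assumed field names
where_clause = "FEATURE = 'County Seat'"        # SQL: quotes and brackets matter

county_seats = {}
with arcpy.da.SearchCursor("cities", fields, where_clause) as cursor:
    for row in cursor:
        # key = city name, value = population
        county_seats[row[0]] = row[2]

for name, population in county_seats.items():
    print("{0}: {1}".format(name, population))
```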

It took a while to get the correct syntax for the SQL statement, since the correct type and number of brackets and quotation marks are critical.  However, I got stuck on the part of the code where the dictionary was to print the name and population.  Finally, with some excellent tips and guidance, I was able to make it work.

Monday, June 22, 2015

Python and Geoprocessing

 
 
This week's lesson focused on writing Python scripts for geoprocessing tools.  While these tools can be run from ArcMap, writing custom scripts extends what they can do.

The lecture, readings, and exercise that prepared us to write our own script included material on importing Arcpy modules, classes, functions and tools.  We practiced setting a workspace, creating variables, working with parameters, learning syntax for commonly used code, working with messages, and using a few geoprocessing tools.  Then we wrote our own geoprocessing code in order to create a 1000 meter buffer around hospitals, dissolving the overlapping perimeter lines.  We also set the XY coordinates for each hospital. After each geoprocessing tool was run, a line of code asked for a message to be printed, allowing the user to see if each step had run successfully. The image above shows that the script ran successfully in PythonWin.  In ArcMap, running the script produced the new shapefiles and added them as layers to the map.
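In outline, the geoprocessing portion of the script resembled the sketch below; the file names are placeholders rather than the lab's exact ones:

```python
import arcpy

arcpy.env.workspace = r"C:\data\module6"   # placeholder workspace
arcpy.env.overwriteOutput = True

hospitals = "hospitals.shp"

# Add X and Y coordinate fields to each hospital point
arcpy.AddXY_management(hospitals)
print(arcpy.GetMessages())   # report whether the step ran successfully

# 1000 meter buffer around each hospital, dissolving the overlapping boundaries
arcpy.Buffer_analysis(hospitals, "hospital_buffer.shp", "1000 Meters",
                      "FULL", "ROUND", "ALL")
print(arcpy.GetMessages())
```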

This week's reading assignment was Chapter 5, and for me it was the most essential one so far, giving me several "ah-ha" moments as I finally understood some of the concepts and language that have been frustrating me up to this point.  Up until now, I didn't really get how the code we were writing actually made things run.  Things are clicking a bit better now.

Thursday, June 18, 2015

Geoprocessing in ArcGIS

 
 
In Module 5, the lesson addressed the following objectives:
  • creating a toolbox
  • creating a tool using ModelBuilder
  • setting model parameters
  • exporting a script from ModelBuilder
  • updating a model-derived script to work outside of ArcMap
  • creating a script tool
The task was to use ModelBuilder to clip a "soils" layer to the extent of another layer, "basin", then to select soils within the clipped output that were not prime farmland and to erase them.  The image above shows the result.
 
Using ModelBuilder was straightforward and fun (compared to writing the script from scratch), and it helped me visualize the process so that when we did have to update the script to work outside of ArcMap, I actually understood what was written.  The three tools, Clip, Select, and Erase, were linked with inputs and outputs whose parameters we manipulated.  This also clarified the process for me. 


Thursday, June 11, 2015

Fixing things

This is the result of running the first code once I had fixed a few syntax errors.  Little things like capital vs. lower-case letters, or an added letter in a word, make a difference, and code won't run if it's incorrect.
This script had eight errors to fix.  I commented out all but the top and ran through the code bit by bit, looking at the error messages in the interactive window and correcting the code based on each message.  Some errors were syntax errors; others were problems with objects and functions.
This script shows the result of using a try-except statement to catch an error.  It took a while to get the except message written and placed correctly, but it finally worked.
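A minimal sketch of the try-except pattern (not the lab's actual error or message):

```python
try:
    layer_count = 5
    print("Average features per layer: " + str(100 / layer_count))
    print(undefined_variable)          # deliberately raises a NameError
except Exception as e:
    # The except block catches the error so the script keeps running
    print("An error occurred: " + str(e))

print("The script kept going past the error.")
```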

The lab this week was very helpful, and I'm glad it came earlier on in the course.  I wish I had learned it for last week's lab!

Tuesday, June 9, 2015

Using lidar to uncover archaeological landscapes at Angkor


Article: “Uncovering archaeological landscapes at Angkor using lidar”

Damian H. Evans, Roland J. Fletcher, Christophe Pottier, et al


While there has been a long history of archaeological investigation of the structures associated with Angkorian civilization, our understanding of the complex system of civic-ceremonial centers, high-density and low-density urban environments, water management systems, and agricultural space has been incomplete and flawed.  One major reason for this is the thick forest that covers the structural remains.  With the development of lidar, the vegetation is, in effect, removed.  This article describes the use of lidar and ArcGIS processing in Cambodia at the site of the medieval Khmer Empire complex of Angkor.

In 2012, a block of territory covering the forested area within most of the Angkor World Heritage site was scanned to map variations in surface topography in both horizontal and vertical planes.  Locations were recorded using GPS.  Data points were processed using a method specifically developed for archaeological surveys in forest environments.  These were processed into DEMs, hillshade models, and local relief models using ArcGIS, and these were then analyzed and interpreted.  Archaeologists then went out into the field to verify the results using data loaded into portable GIS units.

To illustrate a portion of the analytical part of this process, the article includes two visual examples.  In Fig. 2 two layers are presented: a digital orthophoto mosaic showing elevation from the lidar digital surface model, and a layer showing the extruded lidar DTM with 2x vertical exaggeration.  Modern roads and canals are also shown.  In Fig. 3, a lidar DTM, a conventional satellite image showing limited features due to the vegetation, and a map of previously documented archaeological features show the extent to which lidar has revealed new structures.

This use of GIS processing, with the precision of GPS location technology and the ability of lidar to strip away vegetation and reveal even slight variations in the ground surface, has transformed our knowledge of the extent and layout of the Angkorian complex.  First, it revealed that the urban center was at least 35 square km in area, rather than the 9 square km previously thought to be its extent.  It also revealed that the urban landscapes are not confined within the enclosed or walled city, but extend far beyond it.  Much of the extended urban landscape conforms to what was known to be a predictably aligned grid pattern, but lidar revealed some new types of urban features which have no apparent agricultural, occupational, or hydrological function.

As a result of this study, crucial areas have now been mapped and analyzed, and our understanding of the layout and extent of the Khmer civilization at Angkor, and the factors associated with its decline, has greatly increased.

Friday, June 5, 2015

If, then, while, else

If I had all the time in the world, then I would be able to handle this assignment much better.
If I had been taught even a little bit of coding in my educational past, then I would not be so confused and inept.
While I am in school this summer I will try my hardest, even though that isn't cutting it at the moment.
Else I will fail, and that is not an option.
Print result.

 
This week we learned about conditional statements and loops, and then got to try them out.  First we imported the random module, then fixed some code that was already written so that it printed out the dice-throwing results shown above.  Then we created a loop to print a random list of 20 integers between 0 and 10, and then another conditional statement and loop to remove a certain number from the list.
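A sketch of those last two pieces, with an arbitrary number chosen for removal:

```python
import random

# Build a list of 20 random integers between 0 and 10
numbers = []
for i in range(20):
    numbers.append(random.randint(0, 10))
print(numbers)

# Conditional statement plus a loop to strip one value out of the list
unlucky = 8   # arbitrary choice for illustration
if unlucky not in numbers:
    print("{0} is not in the list.".format(unlucky))
else:
    while unlucky in numbers:
        numbers.remove(unlucky)
    print("{0} removed: {1}".format(unlucky, numbers))
```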
 
Never have I spent so much time trying and trying and trying, all for only partial success, and that only with some coaching by a very helpful instructor.  I'm sure that if I had more time, and could really practice each thing we learned this week in small pieces, several times each with variations, this would make more sense, and perhaps I would have been successful at the last piece of code.  However, I ran out of time.  Life intervened.  The lawn had to be mowed before it rained again, and my kids needed their mother back.  So it goes.  I'll keep trying, for what it's worth.
 
 
 
 
 

Thursday, May 28, 2015

Mod2 - Last Name script

 
 
 
This week's task was to write a script. The screenshot above shows the result.  To make the script, several steps were taken which allowed us to practice some of the functions and methods we learned in this week's lecture and exercise, including writing a string, splitting it and creating a list, printing one item on the list using indexing, finding the length of my last name, and multiplying that number by 3.
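The steps amounted to something like this, with a stand-in name:

```python
# Stand-in full name for illustration
full_name = "Ada Byron Lovelace"

# Split the string into a list of name parts
name_list = full_name.split()

# Use indexing to print just the last name
last_name = name_list[-1]
print(last_name)

# Find the length of the last name and multiply it by 3
print(len(last_name) * 3)
```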

Slowly, the fundamental concepts from this week's lesson are sinking in, and this lab was excellent practice.  It's certainly a challenge, but there is something satisfying when it works after trial and error.  The most frustrating thing is PythonWin, which refuses to do anything but flash briefly on the screen.  A nice person at the Help desk clued me in to a quick fix, which is to type wait = input("PRESS ENTER TO CONTINUE") as the last line of code.  That's the only way I could test my script in PythonWin and see the result, or get a screenshot.  The Help desk person couldn't figure out what was going on with PythonWin, even by going on my computer remotely.  Apparently this is not an uncommon problem, since I found quite a bit on it by doing a quick web search.

Friday, May 15, 2015

Folders, by Python



This list of folders was created by running a script created by UWF staff in PythonWin.  In the lab we were supposed to be able to see the details of the script by opening the .py file and choosing "Edit with PythonWin".  Something wasn't working correctly for me, because although PythonWin opened, I was never able to view the script.  Clearly it ran, but I never got to check it out.  I'll see if I can make it work another time, but the window of opportunity for schoolwork has ended for me this week.

Thursday, April 30, 2015

Final Project: Bobwhite-Manatee Transmission Line




For the past few weeks we have been using everything we've learned to create a PowerPoint presentation showcasing maps made with our new skills.  In this project, we were to present the best option for a corridor linking two electric substations with a new transmission line.  The study area included wetlands, conservation lands, and populated areas which needed to be taken into account.  Keeping costs low was another factor.

To begin, we created a model of the process we planned to follow to gather and analyze the data; it included possible GIS tools and methods that could be used.  Next we created maps based on criteria selected during the public input process.  Using GIS processes, the proximity of houses, schools and daycares was investigated.  Wetlands and conservation lands were identified so the impact of the transmission corridor could be minimized on these sensitive environmental areas.  Last, the length of the route was calculated.

Once the maps were created, they were included in a PowerPoint presentation of the kind that would be given to a prospective employer.  Here is a link to the presentation and to the slide-by-slide report.

Presentation

Report

This was an extremely challenging  process, but one that felt, and probably was at least in part, realistic. What is presented here is my best effort, a good start.

Wednesday, April 29, 2015

Final Project

 



In our final project for this course we prepared a map for use in a newspaper article about high school seniors and college entrance scores. Two sets of data had to be presented on one map: test participation rates and average scores.


After preparing a basemap showing the United States and projecting it to an Albers Equal Area projection, since the statistics related to area, the data was tabulated in Excel.  Then choices had to be made about how to present each set of statistics.


The data showing the percent of graduates tested by state is reflected by graduated symbols in 5 classes using the Natural Breaks classification method. The graduated symbols easily convey the differences visually, and 5 classes allows for a reasonable amount of variation per class. Average scores were displayed using a sequential color scheme, also with 5 classes using Natural Breaks classification. The graduated colors allow patterns to be easily discerned, and the classification method considers the distribution of data along the number line. Grouping similar data values together was desirable.
Although I am quite pleased with the results, I wish I had time to tweak a few more things.  However, yesterday I turned on my 4-month-old computer and found a blue screen.  It will be a week before it is fixed, or for a new computer to arrive. Here is what I could do with my son's laptop, with its tiny screen and missing "s" key.

This has been a fun, interesting, and challenging course.  I look forward to putting what I have learned to use, and to practicing the skills I have learned so far.  Clearly, we have just scratched the surface of what is possible.  Thank you, teachers and fellow classmates!

Friday, April 10, 2015

Georeferencing, Editing, and ArcScene


This week we made two maps.  One was a 3D map of the UWF Campus showing the georeferenced buildings and digitized road created in the other map.  The aerial images, buildings, and roads were "draped" over a Digital Elevation Model to create the 3D effect on the land features, and then the building heights were further adjusted to create more of a visual effect.

The lower map shows UWF Campus buildings that have been georeferenced in two ways, creating slightly different RMS Errors but with overall visual accuracy for both.  Two types of editing were done - a polygon was created to show the UWF Gym, and a line was created to show Campus Lane.  We also created a Multiple Buffer Ring around an area designated to protect an eagle's nest.

Monday, April 6, 2015

KML files and Google Earth


Google Earth is amazing.  Like many people, I've spent quite a lot of time checking out places I've been and places I'd like to go.  This lab taught me that it's also an amazing tool for displaying data I choose to add.

This week we learned how to convert files from ArcMap to Google Earth format (KMZ files).  After converting the Dot Density map of S. Florida from last week's lab, we created a tour of 7 locations within that area, including Tampa Bay (above), as well as Miami, Ft. Lauderdale, Tampa and St. Petersburg.  As we added these stops to the tour, we used Google Earth tools to zoom in and around, choosing an interesting perspective for each.

Saturday, March 28, 2015

3D Mapping: Buildings in Boston



After completing the Esri Training Course "3D Visualization Techniques", we were asked to convert 2D features to 3D features using data gathered from LiDAR.

First, the Esri course taught us the vocabulary of 3D - multipatches, Z-values, terrain data sets, TINs, rendering, and extrusion, for example.  After examining the elements of 3D data, we learned how to define base heights, then set about practicing these skills using a map of Crater Lake National Park.  Base height was set for various data types, including an elevation raster, a 3D feature class, a line feature class, a point feature class, and a raster vegetation layer.  Next we learned how to enhance 3D views using Vertical Exaggeration and Illumination.  In the last section we learned how to generate 3D Objects from 2D Objects.  In the exercise, we extruded Buildings and Wells, then added an aerial photo draped over an elevation TIN.  Another step was Extruding Parcel Values so that these differences would be reflected in the height and color of the buildings.

This course provided the skills necessary to complete the final part of the lab, in which a map with a 2D shapefile of buildings in Boston was converted into the 3D map above.  After the two layers were added (the raster base layer and the shapefile buildings layer), the base height was calculated.  Since the raster layer had the elevation information, a new layer was created with random points for each building which had the elevation information added to them. The Mean Z Value was calculated in the SamplePoints layer, then joined to the buildings layer. Once the 2D data had "Z" values, it was possible to create a 3D map by extruding the building features.  For people without access to ArcGIS, these maps can be saved as a .kml file  and can be viewed instead using Google Earth, for example. 

Maps created in 3D are quite attractive when done well.  The map of Napoleon's Moscow campaign is a fantastic example of how easy and fun it is to view and interpret one of these maps.  Being able to fly around, zoom in, and explore in a virtual world is exciting.  Still, at this point the ability of most people to access these maps is limited, and there seems to be the potential for bad data and poorly created maps, just as there is in 2D.  That said, it seems clear to me that the ability of a 3D map to convey information clearly, quickly, and accurately is far beyond that of 2D maps.






Friday, March 27, 2015

Vector 2 Lab: Buffers and Overlays



This week we created a map of possible campsite locations within the De Soto National Forest while being introduced to buffering and overlaying.  These tools are commonly used, and are key to the process of answering questions about locations using specific criteria.

Buffering was used to find areas which satisfied two criteria for possible campsites - being within a certain distance from water features and roads.  A distance of 300m was set as the buffer for roads, 150m for lakes, and 500m for rivers. 

Overlays were used to join layers using one of six possible tools.  The "Union" tool was used to join the two buffered layers from the previous step.  This created a new layer showing areas which were within both the road buffer and the water buffer.

The next step was to exclude conservation land.  This was done by using the overlay tool "Erase", which selected particular areas to remove from the "possible campsites" layer.  This was a multipart layer which we converted to a singlepart layer so that individual records could be accessed and manipulated within the Attribute Table.  In the lab, we looked at the attribute "Area" to see which possible campsites had the largest and smallest area. The total area in square meters was also calculated.

These two tools are clearly very powerful and useful in a large variety of situations, in many industries and occupations.  I'm excited to explore ways in which they are used in archaeology.

Friday, March 20, 2015

Dot Mapping South Florida's Population

 
 
 
Dot density mapping uses raw total data to display distribution patterns.  After carefully choosing an appropriate size and value, dots are distributed on the map in such a way that they most accurately reflect actual distribution on the ground.  Limitations can be placed on the data so that dots are not displayed in areas where they wouldn't actually be.  For example, population dots in the map above are not placed in lakes, ponds, marshes, etc.  Instead they are confined to areas selected specifically to match the type of data (population) that is the focus of the map - in this case, urban areas are an attribute that relates to population density and distribution, so using a mask, the dots were only included in urban areas. 
 
Dot size and value are also important aspects of making a map of this type.  The goal is to display the data in a way that enhances understanding of the data in an accurate way.  When the dots are too large or too small, patterns can be missed or misinterpreted.
 
This method is especially suited to some types of data.  The text illustrates this with maps of wheat harvests, and the map above with population data.  I have also seen it used successfully in archaeological contexts, and I hope to learn more about this in future courses.

Thursday, March 12, 2015

Flow Line Mapping

Lab 9: Map of Immigration to the U.S. using Flow Lines

 
 
This week we were asked to make a map in CorelDraw.  The focus of the lab was flow lines.  After choosing one of two base maps already made for us in ArcMap, I added curved lines using the Bezier tool, then modified them so that they fit the main criterion - not interfering with the features underneath.  To do this, I made them transparent.  I combined this effect with line colors to match each region.  The exception was Asia, which has a grey line - yellow seemed too light and too similar to the green of Oceania's line.  Arrows were added to one end of each flow line and resized to fit.
 
In Excel, a formula was used to find the proportional width of each flow line based on the number of immigrants coming from each region.  This way the flow lines visually represent not only direction of movement, but the number of people moving.
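I won't reproduce the exact spreadsheet formula here, but the idea is simple proportional scaling, roughly like the sketch below; the counts and the maximum line width are made up for illustration:

```python
# Hypothetical immigrant counts by region (not the real figures from the lab)
immigrants = {"Europe": 90000, "Asia": 400000, "Africa": 100000,
              "Oceania": 6000, "Americas": 430000}

max_width = 20.0   # width (in points) given to the largest flow line (assumed)
largest = max(immigrants.values())

# Each region's line width is proportional to its share of the largest flow
widths = {region: max_width * count / largest
          for region, count in immigrants.items()}
print(widths)
```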
 
The last step was to create a legend and all the other usual map elements, and to experiment with some of the stylistic effects available in Corel.  Options included changing color schemes for the regions and U.S. States, making the flow lines transparent, and using the Drop Shadow, Extrude, and Bevel tools.  I chose to make my lines transparent and to use Drop Shadow on the map title. 
 
I think the most useful part of this lab was learning how to use the Bezier tool to make curved lines.  It isn't easy, in my opinion, and it took a lot of trial and error to get the 6 on my map to come out the way they did, but it will be a good skill to develop as curved lines are so common on maps.  I also struggled at first making the choropleth map legend, and at the end I had learned how to use the eye dropper to match fill color, and had worked with the rotate feature enough to get the hang of it.
 
All in all, it was a productive week and a fun, challenging lab.

Thursday, March 5, 2015

Data Search Lab: Palm Beach

Palm Beach County, Florida...



 
 
The results of two weeks' lab work are displayed above.  If someone had told me I'd be able to do this after 8 weeks of study, I wouldn't have believed them.  If I had another couple of weeks, the maps would look better  - but all things considered, I'm quite proud of what I accomplished.
 
This week's lab asked us to search for specific types of data, download the data, put all data in the same projection, clip it to be within a specified county (Palm Beach in my case), and then make everything look sharp.  Some of the data was vector, some was raster.  Some came from Labins, some from FGDL, some from USGS.  One thing that struck me during this process - there is a TON of data out there, and it's quite interesting to check it all out.  There will come a time when I have the luxury of doing that, but not right now!
 
The repetition of adding data, checking out the metadata, reprojecting, and clipping made me really understand the process.  I don't have to look through old lab notes to do it any more, and I call that progress. 
 


Tuesday, March 3, 2015

Precipitation Maps


The isarithmic precipitation maps we learned how to produce this week are familiar to most people, and it was interesting to see how one is put together after having looked at them so many times on the nightly weather report. 

We worked with raster data from the USDA that was interpolated using the PRISM interpolation method devised at Oregon State University.  Using the Spatial Analyst tools we created two maps showing annual precipitation in Washington state.  One map shows the data as a continuous spread of color in a spectrum; the transition between one tone and another isn't perceptible.  The other map displays the data with hypsometric tinting, which creates a stepped appearance in the color spectrum.  A hillshade effect was added to both maps to display them in relief.  This week we also learned how to make contour lines, using the Spatial Analyst tool "Contour List", where the values for each index contour line were entered, and then chose which map should have contours added to it.


 
 

This was a fun lab, and I can see how useful this technique will be going forward.


Thursday, February 26, 2015

Europe: Men, Women, Wine

 
 
This week our task involved using what we learned last week about classification methods, then choosing an appropriate classification for each of three maps of Europe to show population density, males as a percentage of the population, and females as a percentage of the population.  We then added data on wine consumption per capita.
 
This was done in ArcMap.  After creating three data frames and adding the European Population data provided in the lab, I selected the appropriate Field Value, chose "Graduated Colors", and made layers for the assigned map topics.  Each had to be classified, and an appropriate scheme had to be chosen. 
 
Sequential color ramps in red and green classify male and female population percentages into 5 classes.  The contrasting colors work well next to each other, and 5 classes make the variation of color within the ramp easy to interpret. I chose a grey color ramp for the population density, and created 6 classes - the dark tones really stand out, which becomes even more important when the Wine Consumption symbols were added.  Proportional symbols were best suited to showing wine consumption per capita, creating an overall visual pattern that is easy to interpret.
 
Although I didn't have time to experiment with it this time, I look forward to making some customized color ramps sometime thanks to some tips from Lucas.

Saturday, February 21, 2015

Lab 6: Projections Part II



This map is the result of a huge number of steps and an enormous learning curve.  We learned how to access data online, download it, import it into ArcMap, and work with it to make sure all spatial references were defined.  One of the last steps was to make sure there was a common coordinate system and projection.  Because many of the layers were in a different projection (Albers), it was necessary to reproject them so that they lined up correctly.

Another important step was undertaken in Excel, where a formula was used to convert latitude/longitude values in degrees/minutes/seconds into decimal degrees.  Then the file, which specified the locations of petroleum storage tanks in a monitoring program, was imported into ArcMap and saved as a shapefile.  Again, reprojection was necessary to make the STCM sites show up correctly.
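The conversion itself is the standard degrees-minutes-seconds arithmetic; here it is sketched in Python for illustration (the Excel formula does the same thing in a cell, and the example coordinate is made up):

```python
def dms_to_decimal(degrees, minutes, seconds):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60.0 + seconds / 3600.0

# Example: 30 degrees 26 minutes 14 seconds
print(dms_to_decimal(30, 26, 14))   # roughly 30.4372
```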

Once the main map was constructed, the process of "owning my map" began.  I added two inset maps to show the location of the two aerial quads within Escambia County, and of Escambia County within Florida.  Because of the long shape of the quad map and the lack of white space on that map, I thought I'd place several of the other map elements on the right for balance.

This was the most difficult and time-consuming lab so far, and completing it makes me feel like I have accomplished quite a lot.

Thursday, February 19, 2015

Lab 6: Four Data Classification Methods

 
 
 
This is an example of how different classification methods produce different results with the same data.  Here we see four possible ways of looking at the percentage of people over 65 in Escambia County, Florida.  While there are similarities among them all, each has its own spin on the information.
 
The Lab required that we create data frames of the same information, presented using the Natural Breaks, Equal Interval, Quantile, and Standard Deviation classification methods, then choose which one best represented the data and explain why.
 
To begin, I added the Escambia County shapefile to the first dataframe, then created 3 more dataframes and dragged the same shapefile into each one.  I renamed them according to each of the 4 classification methods.  Then I went to each layer's properties, chose the Symbology tab, and chose the Field Value PCT_65ABV to get the data about the percentage of population over 65.  Using Graduated Colors, I chose a ramp that would suit the data, ranging from light to dark.   I then selected one of the Classification types we were assigned, making sure there were 5 classes.  When the labels were created I formatted them to 2 decimal places. 
 
Once I had all the required data frames, I switched to Layout view and "owned my map" by including the usual required elements, and using the design concepts we have learned over the past few weeks.  At this point I realized the color ramp I had chosen for Standard Deviation didn't match the data - it went from light to dark, not reflecting the increased values as you move away from the mean.  I went back and chose a different one that has a light color for the mean, and darker colors at either limit.
 
After looking at all 4 classification methods, I decided the Standard Deviation method suited the data best.  The color scheme makes it clear where there is an average percentage of people over 65, and where there are far more and far fewer.  

Tuesday, February 10, 2015

Spatial Statistics Lab


The lab for Week 5 took us into the world of spatial statistics.  The Esri course we completed introduced topics that focused on looking at spatial relationships within a dataset, identifying patterns, and making choices about which analysis tools to use based on trends in the data.  Exploring the data is the first step.

Using a map of Europe showing weather monitoring stations, with temperature as the data to be analyzed, I looked at spatial relationships using Geostatistical Analyst tools.  I found the mean and median values, noticing if they were located close to one another.  By examining patterns in a histogram, a normal QQ plot, a Voronoi map, and a semivariogram cloud, I found outliers in the data.  However, overall there was a normal distribution of values. The last part of the process before selecting an appropriate analysis technique is to look at spatial trends.

As a result of exploring the data in these ways, the most appropriate choice for analysis seemed to be geostatistical interpolation.  This was a good fit because the data is normally distributed, stationary, autocorrelated, and has no local trends.  If a weather station needed to predict future temperatures, this technique would be useful.

This lesson probably only showed the tip of the iceberg in terms of what ArcGIS is capable of, and it has gotten me thinking about how it could be used for a particular archaeology project I'd like to do in the future.

Monday, February 9, 2015

Florida Three Ways


This week's lab was based on learning about projections, and using ArcGIS to reproject a data layer to a common Projected Coordinate System.  As the legends for each map indicate, there are slight differences in area between the three projections.

The process of actually making the map was easier this week - practice, practice, practice! My attempt at using Excel to make a chart showing the four area numbers was moderately successful, and certainly neater than using a text box, but lining everything up exactly right eluded me, even after over an hour of fine tuning the font, spacing between numbers, and text size.  The blue box wouldn't resize to exactly where I wanted it to go.  Still, the numbers do line up and the information is useful.

Wednesday, February 4, 2015

Week 4 Lab: Typography

      Cartography is definitely an art, and this week we focused on one aspect, typography.  With a base map of a portion of the Florida Keys centered around Marathon, we were asked to locate and label a list of geographic places, then complete the map with the usual required elements.  At the same time, we were asked to incorporate what we had learned about typography from Chapter 11 in the text, and from the lecture and tutorial material provided.
      After giving my map a horizontal orientation to match the lateral spread of the Keys, I went online and found all the required places, then made rough labels for each.  I focused on one category at a time, starting with the Keys, then cities, then the three specialized places (airport, park, and country club), and finally the bodies of water.  With each step, I tried to think of how to use font, size and positioning to make clear labels.  The airport, park and country club each got appropriate standard symbols, while the cities had traditional circle symbols.  I tried to be careful not to overlay labels on top of land.  When necessary, I used short lines to connect labels to their locations on the map.  The most challenging part of labeling was learning to get text to follow a path.  It does look nice when the water label follows the contours of the coastline.
      The last several hours were spent finding an appropriate map for the inset.  I kept finding .gifs or bitmaps, and all the .jpgs I found were too small, so the pixels were enormous and the map was fuzzy.  Finally I got one, and once settled in the corner, it looked good, balanced against the Legend in the diagonal corner.
      I chose a dark blue background, and made the Keys stand out in a light grey-pinkish color.  Learning how to use fill was not easy at first, but I got the hang of it finally. 
      This week's lab was good practice at using CorelDraw, and at using graphic design elements to produce a basic, useful map.
      Seeing the map here, I guess I needed another border around the page.  This isn't what it looked like when I saw it in Paint.  There is room for improvement, but I'm still proud of my efforts, given my extremely limited previous experience in graphic design.


Sunday, February 1, 2015

Lab 4: Map Packages

 
This week we explored ArcGIS Online and learned about Map and Tile Packages through two Esri Training Courses. 
 
First we accessed a course about the possibilities of creating and sharing maps using ArcGIS Online.  This will be useful in the future when collaborating with others on a project.
 
Following that introduction, we moved on to a course on creating and sharing map packages (MPKs), at the end of which we were required to create and share two maps, one of climbing points in Yosemite National Park, and the other of a study area for ponderosa pines in the Aguirre Springs drainage region in New Mexico.  We practiced following the standard workflow (planning - data - cartographic design - share) and optimized the Aguirre Springs map package based on who would use the map, and for what specific purposes.  We modified symbols and scale, then shared the MPKs after completing the Item Description, which included adding our name to the "Tags" and "Credit" sections.  The Aguirre Springs MPK also included a text file as an additional document describing the map.
 
 

 
 
These screenshots show the two maps I shared on ArcGIS Online.  The process was challenging, but straightforward using the directions.  As with other courses I have taken through Esri (Intro to ArcGIS and part of another course), the training videos and readings were clearly written and easy to follow, and the exercises were very useful.  

Thursday, January 29, 2015

Cartographic Design Lab: Ward 7 Schools

This week we produced a map showing Elementary, Middle and High Schools in Ward 7 of Washington, D.C.  It included an inset map showing the location of Ward 7 within the city.  The focus of the lab was to use key design elements when producing a map.


I tried to apply the Gestalt principles of visual hierarchy, contrast, figure-ground relationship, and balance.  By using larger, bright red symbols, the schools become the emphasis.  Ward 7 is easy to see because its light color shows up well against both the grey and the blue.  Streets are less important, so they are drawn in light grey with thin lines.  I balanced the map by placing three similar-sized boxes containing the inset map, legend, and title in a roughly triangular formation.  Less important elements such as the north arrow, scale, data source, and author/date are in smaller font with no borders.

It is easy to spend lots of time playing around with design elements - tweaking could be nearly endless if there weren't so many other things to do.  


Tuesday, January 27, 2015

Three maps of Mexico

This week we made 3 maps to practice what we learned in "Own Your Map", adding a few more skills along the way.


The topographical map was my favorite because of the ability to play around with the color options.  Some really made elevation clear, while others made the more subtle changes disappear.


This map showing population ranges within the states of Mexico was challenging because of problems I had making the ranges into round numbers.  I wasn't able to do it the way we were instructed because the numbers automatically changed to things other than what I typed.  In the end I changed them within the legend by renaming them.

This map taught me the most about how to change symbols, both on the map and within the legend.  It took a LONG time, but I'm happy with the result.


I'm still taking a long time with the labs.  The discussion group helped a lot this week with the resizing and neatline issues I was having.  Finding the data source in the metadata/item description eluded me.