Saturday, December 12, 2015

Spectral Signature Analysis

Goal and Background:
The goal of this lab was to show my ability to measure and interpret the spectral reflectance of various Earth surface and near-surface materials captured in satellite images. I collected spectral signatures from remotely sensed images, graphed them, and analyzed them to verify whether they pass the spectral separability test.

Methods:
I opened a Landsat ETM+ image from 2000 of the Eau Claire area in Erdas Imagine. I utilized the polygon tool under the Drawing tab to digitize an area in Lake Wissota. I then used the Signature Editor tool under Raster-Supervised.

In the Signature Editor I clicked Create New Signature from AOI. I changed its name and color and then displayed the mean plot window. I repeated these steps until I had spectral signatures for the following features:

  • Standing Water
  • Moving Water
  • Vegetation
  • Riparian Vegetation
  • Crops
  • Urban Grass
  • Dry Soil (uncultivated)
  • Moist Soil (uncultivated)
  • Rock
  • Asphalt highway
  • Airport runway
  • Concrete surface (Parking Lot)    
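The mean plot the Signature Editor displays is simply the per-band average of the pixels inside each digitized AOI. A minimal sketch of that calculation, using made-up DN values (not the lab data):

```python
import numpy as np

# Hypothetical DN values for AOI pixels: rows = pixels, columns = six
# Landsat ETM+ reflective bands (1-5 and 7). Values are illustrative only.
water_pixels = np.array([
    [60, 45, 30, 12, 8, 5],
    [62, 47, 31, 13, 9, 6],
    [58, 44, 29, 11, 7, 5],
])

# The Signature Editor's mean signature is just the per-band average
# over the pixels in the AOI.
mean_signature = water_pixels.mean(axis=0)
print(mean_signature)  # one mean value per band
```

Plotting these per-class means across bands reproduces the kind of mean plot window shown in the figures below.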
 
Results:

 Figure 1. This image demonstrates all of the spectral signatures collected together in the Signature Editor.


Figure 2. This image demonstrates the mean spectral signatures of the dry and moist soils. The two curves run nearly parallel, but the difference between them is greatest around Band 5 (1.55-1.75 micrometers). This is likely due to the water in the moist soil, as water absorbs strongly in the NIR and MIR wavelengths.



Figure 3. This image demonstrates all of the mean spectral signatures that I collected. Most of the vegetation is similar across the spectral channels, which makes sense because most plants absorb and reflect the same wavelengths of light. The asphalt, rock, soils, and runway are all fairly similar and reflect close to the same bands. This is most likely because they are made of similar materials and minerals. The water is the most different in its spectral signature because it reflects the least amount of infrared light. 


Sources:  

United States Geological Survey. (2000). Earth Resources Observation Science Center.

Thursday, December 3, 2015

Photogrammetry

Goal and Background:
The goal of this lab was to show my ability to perform different photogrammetric tasks on aerial photographs and satellite images. These tasks were accomplished through an understanding of the calculation of photographic scales, the measurement of areas and perimeters of features, and the calculation of relief displacement, as well as through using stereoscopy and performing orthorectification.

Methods:
Part 1: Scales, measurements, and relief displacement
Section 1: Calculating scale of nearly vertical aerial photographs
I was given the ground distance between point A and B to be 8822.47 ft. I measured the distance between those two points on an aerial image of the same area on my screen and determined the scale mathematically as follows:

2 inches on the photo = 8822.47 feet in the real world
2 in / (8822.47 ft × 12 in/ft) = 1 in / X
2X = 105869.64 in
X = 105869.64 in / 2
X = 52934.82 in
Scale = 1:52934.82
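The ratio above can be sketched in a few lines of Python, using the same photo and ground distances:

```python
# Sketch of the scale computation above: photo distance over ground
# distance, with both converted to the same unit (inches).
photo_distance_in = 2.0        # measured on the photo
ground_distance_ft = 8822.47   # given ground distance between A and B

ground_distance_in = ground_distance_ft * 12
scale_denominator = ground_distance_in / photo_distance_in
print(f"Scale = 1:{scale_denominator:.2f}")  # Scale = 1:52934.82
```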

I was then given an image of Eau Claire County taken by a high altitude reconnaissance aircraft. The aircraft took the image at 20,000 ft above sea level with a lens of 152 mm focal length. The elevation of Eau Claire County is 796 ft. With this information I determined the photograph's scale mathematically as follows:


f = 152 mm (5.98 in)
H = 20,000 ft above sea level
h = 796 ft

S = f / (H - h)
S = 152 mm / (20,000 ft - 796 ft)
S = 5.98 in / (240,000 in - 9,552 in)
S = 5.98 in / 230,448 in
S ≈ 0.0000259

0.0000259 = 1/X
X ≈ 1 / 0.0000259
X ≈ 38537.13
Scale = 1:38537.13
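The same S = f / (H − h) formula can be checked quickly in Python (using the rounded 5.98 in focal length from the work above, so the result agrees with it to within rounding):

```python
# Sketch of the scale-from-flying-height formula S = f / (H - h).
focal_length_in = 5.98        # 152 mm lens, rounded to inches as above
flying_height_ft = 20000      # aircraft altitude above sea level
terrain_elev_ft = 796         # elevation of Eau Claire County

denom_in = (flying_height_ft - terrain_elev_ft) * 12  # feet -> inches
scale_denominator = denom_in / focal_length_in
print(f"Scale = 1:{scale_denominator:.0f}")
```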


Section 2: Measurement of areas of features on aerial photographs
I utilized the 'Measure' tool to draw a polygon around a feature in the provided image. After digitizing, I was able to determine the feature's perimeter and area in different units.

Section 3: Calculating relief displacement from object height
I was tasked with determining the relief displacement of a smoke stack in an aerial image. The height of the aerial camera above datum was 3980 ft and the scale of the aerial photograph was 1:3209. I determined the relief displacement mathematically as follows:

d = (h × r) / H
d = (1604.5 in × 10.5 in) / (3980 ft × 12 in/ft)
d = 0.352 in
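The same relief displacement arithmetic, sketched in Python with the values above (all lengths in inches):

```python
# Sketch of the relief displacement formula d = (h * r) / H, where h is
# the object height, r the radial distance from the principal point to
# the top of the object on the photo, and H the camera height above datum.
object_height_in = 1604.5      # smoke stack height, scaled from the photo
radial_distance_in = 10.5      # measured on the photo
flying_height_in = 3980 * 12   # 3980 ft above datum, converted to inches

displacement_in = (object_height_in * radial_distance_in) / flying_height_in
print(f"Relief displacement = {displacement_in:.3f} in")  # ≈ 0.353 in
```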

Part 2: Stereoscopy
I used the Terrain-Anaglyph tool in Erdas Imagine and input an image with a 1 meter spatial resolution along with a digital elevation model (DEM) of the same area with a 10 meter spatial resolution. I increased the vertical exaggeration to 2 and ran the model. Using Polaroid glasses I was able to observe differences in the elevation characteristics of my anaglyph image product.

Part 3: Orthorectification
Section 1: Create a new project
I opened LPS Project Manager through Toolbox-IMAGINE Photogrammetry. I created a new block file and chose Polynomial-based Pushbroom and SPOT Pushbroom as my Geometric Model Category. I set my projection type to UTM, with a spheroid name of Clarke 1866 and a UTM Zone field of 11.

Section 2: Add imagery to the block and define sensor model
I added a new frame to the images folder and loaded my first image. I reviewed the sensor information of my image.

Section 3: Activate point measurement tool and collect GCPs
I used the Start Point Measurement tool and selected Classic Point Measurement Tool. I then input a reference image and used the viewer as a reference. I then collected 2 GCPs. After the first 2 GCPs I selected the Automatic (x,y) Drive icon. I then collected more GCPs until I had 9 in total.

After my ninth GCP, I reset my horizontal reference source to a different image and collected 2 more GCPs. I then chose the Reset Vertical Reference Source icon, chose DEM, and set the DEM to my reference image file. I right-clicked the Point # column on the reference cell array and selected all. I then clicked on the Update Z Values on Selected Points icon and updated them to match my reference image.
 
Section 4: Set type and usage, add a 2nd image to the block and collect GCPs
I left-clicked the column labeled Type to highlight it and then right-clicked it to access the column options and selected Formula. I then typed Full and applied the change. I repeated these steps for the Usage column, typing Control instead of Full.

I then loaded my second image into LPS and checked its frame properties. I selected the Classic Point Measurement Tool and collected GCPs from my second image using my first image as the reference for the GCPs. I collected all the points the two images had in common.

Section 5: Automatic tie point collection, triangulation, and ortho resample
I used the Automatic Tie Point Generation Properties icon from the Point Measurement tool palette, set the images used to all available, and set the initial type to Exterior/Header/GCP. I set the Intended Number of Points/Image to 40. After running, I checked a number of points for accuracy.

In the Imagine Photogrammetry Project Manager I used Edit-Triangulation Properties and changed the Iterations with Relaxation value to 3. Under Ground Point Type and Standard Deviations I selected Same Weighted Values under the Type field. I then changed the X,Y, and Z fields to 15 and then ran the triangulation. I saved the report for future use.

I used the Start Ortho Resampling Process icon in the IMAGINE Photogrammetry Project Manager. I set the DTM Source to DEM and input my DEM image file. In the Output Cell Sizes field for X and Y I put 10. Under the Advanced Tab I verified that the resampling method was Bilinear Interpolation and then clicked the Add button. I input my second image and checked Use Current Cell Sizes. I then ran the Ortho Resampling.

Section 6: Viewing the orthorectified images
I brought both of my images into Erdas Imagine to view.
 
Results:
Figure 1. This image demonstrates the results from part 2. This image appears to be a fairly good representation of the elevation features of Eau Claire County. In some instances the trees and buildings appear to be much higher than they are in reality, and this could be due to stereoscopic parallax that arose from using a DEM with a 10 meter spatial resolution and a base image with a 1 meter spatial resolution.

Figure 2. This image demonstrates the results from part 3. There is a slight stair-step effect between the two images when the viewer is zoomed in close. The features of the images align nearly perfectly and there is only a thin black line between the two images when zoomed out further. 

Sources:  

Erdas Imagine. (2009). Digital Elevation Model (DEM) Palm Springs, CA.
Erdas Imagine. (2009). SPOT Satellite Images.
Erdas Imagine. (2009). National Aerial Photography Program (NAPP) 2 meter images.
United States Department of Agriculture. (2005). National Agriculture Imagery Program (NAIP).
United States Department of Agriculture Natural Resources Conservation Service. (2010). Digital Elevation Model (DEM) for Eau Claire, WI.

Thursday, November 19, 2015

Geometric Correction

Goal and Background:
The goal of this exercise was to show my ability to use two different types of geometric correction techniques.

Methods:
Part 1: Image-to-map rectification
First I opened a satellite image of an area of Chicago and a map of the same area in Erdas Imagine. I then used the Multispectral Control Points tool, using a polynomial model, and collected my GCPs from the map image. I used the Create GCP tool to place my GCPs on the image and the map layer until I had four GCPs. I then adjusted my GCPs until my Control Point Error (Total) was less than 2.0. I then used the Display Resample Image Dialog tool to create my adjusted image, leaving the default settings the same.

Part 2: Image to image registration
I opened a distorted image of Sierra Leone and a reference image in Erdas Imagine. I then used the Multispectral Control Points tool, using a polynomial model with the polynomial order changed to 3, and collected my GCPs from the reference image. I used the Create GCP tool to place my GCPs on the image and the map layer until I had 12 GCPs. I then adjusted my GCPs until my Control Point Error (Total) was less than 1.0. I then used the Display Resample Image Dialog tool to create my adjusted image, changing the resampling method to Bilinear Interpolation.
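The total control point error that Erdas reports is the root mean square of the per-GCP residuals. A minimal sketch of that statistic, using hypothetical residuals (not the lab's values):

```python
import math

# Hypothetical per-GCP residuals (in pixels) in x and y. The total RMS
# error is the square root of the mean squared residual over all points.
residuals = [(0.4, -0.3), (-0.2, 0.5), (0.1, -0.4), (0.3, 0.2)]

total_rms = math.sqrt(
    sum(dx**2 + dy**2 for dx, dy in residuals) / len(residuals)
)
print(f"Total RMS error = {total_rms:.4f}")
```

Lowering this value below the 2.0 and 1.0 thresholds in parts 1 and 2 is what the GCP adjustment step accomplishes.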
 
Results:

Figure 1. This image demonstrates the GCPs I placed in part 1, with a total RMS Error of 1.6068.


Figure 2. This image demonstrates the results from part 1, and shows my resampled image based on image to map rectification that I conducted. The new image is much closer to how the map appeared and the distortion of many of the features has been lessened. 


Figure 3. This image demonstrates the GCPs I placed in part 2, with a total RMS Error of 0.5887.


Figure 4. This image demonstrates the results from part 2, and shows my resampled image based on the image to image registration that I conducted. The new image has much less contrast due to the bilinear interpolation resampling method I used. It also still appears distorted, which could be due to the amount of RMS error remaining in the placement of my GCPs.

Sources:  
Satellite images are from Earth Resources Observation and Science Center, United States Geological Survey. Digital raster graphic (DRG) is from Illinois Geospatial Data Clearing House. 

Thursday, November 12, 2015

Lidar Remote Sensing

Goal and Background:
The goal of this exercise was to show my ability to process and retrieve various surface and terrain models, and to process and create an intensity image and other derivative products from a point cloud.

Methods:
Part 1: Point Cloud Visualization in Erdas Imagine
I opened the LAS dataset with ArcMap and used the label manager and a shapefile to determine the tile position of the files within the dataset.
 
Part 2: Generate a LAS dataset and explore Lidar point clouds with ArcGIS
Section 1: Create Folder Connection
I used ArcCatalog to create a LAS dataset containing the Eau Claire data files. I ensured that the correct statistics were with the files and looked at the metadata to assign the proper horizontal and vertical coordinate systems.
 
Part 3: Generation of Lidar derivative products
Section 1: Deriving DSM and DTM products from point clouds
I used the LAS Dataset to Raster tool in the ArcToolbox to create my digital surface model of the first return by setting the points tool to color for elevation and the filter for first return. I used a Binning interpolation method with a natural neighbor void filling and a 2 meter cell size. I then created a hillshade of the DSM by using the 3D Analyst Tools in the ArcToolbox.

I then used the LAS Dataset to Raster tool in the ArcToolbox again, this time setting the filter to GROUND with the point tool show points colored for elevation to create the DTM. I used a Binning interpolation method with a natural neighbor void filling and a 2 meter cell size. I then created a hillshade of the DTM by using the 3D Analyst Tools in the ArcToolbox.  

Section 2: Deriving Lidar Intensity image from point cloud
I set the LAS dataset to Points and the filter to first return. I then used the LAS Dataset to Raster tool in the ArcToolbox, setting the value field to INTENSITY, the Binning cell assignment to AVERAGE, the void fill to natural neighbor and the cell size to 2 meters.
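The AVERAGE binning that the LAS Dataset to Raster tool performs amounts to dropping each first-return point into its 2 m cell and averaging the intensities that land there. A minimal sketch with made-up points (the void fill it applies afterward is omitted here):

```python
import numpy as np

# Made-up first-return points: coordinates in meters and an intensity each.
cell_size = 2.0
xs = np.array([0.5, 1.2, 3.7, 2.1])
ys = np.array([0.4, 1.8, 0.9, 3.5])
intensity = np.array([100.0, 120.0, 80.0, 60.0])

# Assign each point to a raster cell.
cols = (xs // cell_size).astype(int)
rows = (ys // cell_size).astype(int)

# Accumulate sums and counts per cell, then average.
grid_sum = np.zeros((2, 2))
grid_cnt = np.zeros((2, 2))
np.add.at(grid_sum, (rows, cols), intensity)
np.add.at(grid_cnt, (rows, cols), 1)

# Empty cells stay NaN; the tool fills those with natural neighbor instead.
raster = np.where(grid_cnt > 0, grid_sum / np.maximum(grid_cnt, 1), np.nan)
print(raster)
```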

Results:

Figure 1. This image demonstrates the results from part 3, section 1 and shows the hillshade image of the DSM.

Figure 2. This image demonstrates the results from part 3, section 2 and shows the intensity image obtained from the Eau Claire LAS dataset. 

Sources:  
All data was provided by Dr. Wilson with Lab 5. 

Thursday, October 29, 2015

Miscellaneous Image Functions

Goal and Background:
The goal of this exercise was to show my ability to: delineate a study area from a larger satellite image scene, optimize the spatial resolution of images for visual interpretation purposes, use some radiometric enhancement techniques in optical images, link a satellite image to Google Earth, to use various methods of resampling satellite images, to explore image mosaicking, and to use binary change detection through the use of simple graphical modeling. 

Methods:
Part 1: Image Subsetting of a Study Area
Section 1: To take my first subset of the Eau Claire area, I implemented the raster tool Inquire Box. I created an inquire box around my selected area and then used the Subset & Chip-Create Subset Image tool to create my subset image.

Section 2: To take my second subset of the Eau Claire area, I added a shapefile of Eau Claire and Chippewa Counties to the viewer with my input image. I then created an area of interest around the shape files and saved the layer as an AOI file. Then I employed the raster tool Subset & Chip and used the new AOI file I created to create my subset image. 
  

Part 2: Image Fusion
I employed the raster pan sharpen tool Resolution Merge to execute image fusion. I input both a panchromatic band image and a multispectral band image into the tool. I employed the multiplicative method and the nearest neighbor resampling technique.
 
Part 3: Simple Radiometric Fusion Techniques
I employed the raster radiometric tool Haze reduction on my input image to reduce the haze and clouds in the image.

Part 4: Linking Image Viewer to Google Earth 
With the input image in Erdas Imagine, I connected to Google Earth through Erdas. I then matched Google Earth to my view screen and synced the two displays to make a viable interpretation key out of Google Earth for my input image.

Part 5: Resampling
I used the raster spatial tool, Resample Pixel Size on my input image. I repeated this tool twice, once using the Nearest Neighbor approach and once using the Bilinear Interpolation to see the differences that occur. Both times I resampled the image from 30x30 meters to 15x15 meters and ensured that I kept my pixels square.  
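Nearest neighbor resampling from 30 m to 15 m pixels simply splits each pixel into a 2×2 block of the same value, while bilinear interpolation averages neighboring pixels, which is why the two outputs look different. A minimal nearest-neighbor sketch on a toy array:

```python
import numpy as np

# Toy 2x2 "30 m" image; values are arbitrary DNs.
image_30m = np.array([[10, 20],
                      [30, 40]])

# Nearest neighbor upsampling to 15 m: duplicate every pixel into a
# 2x2 block (no new values are invented, unlike bilinear interpolation).
image_15m = np.repeat(np.repeat(image_30m, 2, axis=0), 2, axis=1)
print(image_15m)
```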

Part 6: Image Mosaicking
I imported my input images ensuring that Multiple Images in Virtual Mosaic and Background Transparent were set.

Section 1: Image mosaic with the use of Mosaic Express
I employed the raster mosaic tool, Mosaic Express by inputting my two images in the correct order and running the program. 

Section 2: Image mosaic with the use of MosaicPro
I employed the raster mosaic tool, MosaicPro by adding my two input images into the tool, ensuring that Compute Active Area was set as the default. I corrected the order of the images within the tool and adjusted the radiometric properties by selecting histogram  matching and overlap areas as the settings. I then ran the mosaic.  

Part 7: Binary Change Detection
Section 1: Creating a difference image
I employed the raster functions tool Two Image Functions, inputting 2011 and 1991 multispectral images of the Chippewa Valley area. I ran this tool and then used the histogram and metadata to determine the cutoff points for the values that had changed between those two years. I used the equation: mean + 1.5 × standard deviation.

Section 2: Mapping change pixels in difference image using spatial modeler
I used the following equation to create a model to remove the negative values in my difference image: 
ΔBVijk = BVijk(1) − BVijk(2) + c
Where:
ΔBVijk = change pixel value
BVijk(1) = brightness value of the 2011 image
BVijk(2) = brightness value of the 1991 image
c = constant (127 in this case)
i = line number
j = column number
k = a single band of Landsat TM

I then employed Model Maker to create a model using band 4 of my 1991 and 2011 images and the equation above to create a difference image. I then determined the change threshold using the equation: mean + (3 × standard deviation). I then created another model using my difference image and the EITHER/IF/OR conditional function. I input:
EITHER 1 IF ($n1_ec_91 > 202.1) OR 0 OTHERWISE
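The difference model and conditional above can be sketched with NumPy. The brightness values here are made up, and the threshold 202.1 is the mean + 3σ value computed from the full difference image in the lab, not from this toy array:

```python
import numpy as np

# Made-up band-4 brightness values for a few pixels in each year.
bv_2011 = np.array([[100, 150], [200, 90]], dtype=float)
bv_1991 = np.array([[ 98,  60], [205, 88]], dtype=float)
c = 127  # constant offset that keeps the difference image non-negative

# Difference image: BV(1) - BV(2) + c.
diff = bv_2011 - bv_1991 + c

# Change threshold taken from the lab's mean + 3*std computation.
threshold = 202.1

# EITHER 1 IF diff > threshold OR 0 OTHERWISE
change = (diff > threshold).astype(int)
print(change)  # [[0 1] [0 0]]
```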

After running the model I input my difference image into ArcMap to make a more legible map.

Results:
Figure 1. This image demonstrates the results from part 1, section 1 and shows a subset of an image using the inquire box. 
Figure 2. This image demonstrates the results from part 1 section 2 and shows the results of subsetting an image using a shape file. 
Figure 3. This image demonstrates the results from Part 2, displaying both the input and pansharpened images. 
Figure 4. This image demonstrates the results from part 5, displaying both the input image and the result after resampling using the nearest neighbor method. 

Figure 5.  This image demonstrates the results from part 5, displaying both the input image and the result after resampling using the bilinear interpolation method. 

Figure 6. This image demonstrates the results from part 6, section 1 after using Mosaic Express. 

Figure 7. This image demonstrates the results from part 6 section 2 after using MosaicPro.

Figure 8. This histogram demonstrates the results from part 7 section 1 and displays the cutoff points where changes have occurred between 1991 and 2011.  
 
Figure 9. This map demonstrates the results from part 7 section 2, where areas of the Chippewa Valley have appeared to have changed within band 4 between 1991 and 2011. 

Sources:  
All data was provided by Dr. Wilson with Lab 4.