Friday, March 18, 2016

Python Blog


In this blog I will keep an updated page showing all of the Python scripts that I use throughout the spring semester in GIS 2 (Geog 337).

The first Python script that I created was for our Exercise 5 assignment. The purpose of the script was to take the rasters that we downloaded from internet sources and load them into a geodatabase. We had three rasters, and each came from a different source, which means they were all in different coordinate systems. Because of this, the script first needed to project each raster. The next step was to clip the rasters: our study area was Trempealeau County in Wisconsin, so we clipped each raster to the county boundary. The last step was to load the clipped and projected rasters into the geodatabase. Below is a screenshot of the final script that successfully ran and accomplished all of the above tasks.
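The project-clip-load workflow described above could be sketched roughly as follows. This is a minimal sketch, not the actual assignment script: the raster names, paths, and spatial reference string are hypothetical placeholders, and arcpy is imported inside the function because it is only available in an ArcGIS Python environment.

```python
# Sketch of the Exercise 5 workflow: project, clip, and load rasters.
# All file names, paths, and the spatial reference below are hypothetical
# placeholders, not the actual assignment data.

RASTERS = ["landcover.tif", "elevation.tif", "soils.tif"]
GDB = r"C:\Ex5\Trempealeau.gdb"
BOUNDARY = r"C:\Ex5\county_boundary.shp"
TARGET_SR = "NAD 1983 HARN Wisconsin TM"  # placeholder projection name

def output_name(raster, step):
    """Build a derived raster name, e.g. 'landcover.tif' -> 'landcover_clip'."""
    base = raster.rsplit(".", 1)[0]
    return base + "_" + step

def process_rasters():
    # arcpy only exists inside an ArcGIS Python environment,
    # so it is imported here rather than at module level.
    import arcpy
    arcpy.env.overwriteOutput = True
    sr = arcpy.SpatialReference(TARGET_SR)
    for raster in RASTERS:
        projected = output_name(raster, "proj")
        clipped = output_name(raster, "clip")
        # 1) project each raster into the common coordinate system
        arcpy.ProjectRaster_management(raster, projected, sr)
        # 2) clip to the Trempealeau County boundary shape
        arcpy.Clip_management(projected, "#", clipped, BOUNDARY,
                              clipping_geometry="ClippingGeometry")
        # 3) load the result into the geodatabase
        arcpy.CopyRaster_management(clipped, GDB + "\\" + output_name(raster, "final"))
```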










Exercise 7 Python Script

The purpose of this script was to prepare our data for part two of Exercise 7 by selecting all of the mines that will be used in the network analysis. To set up the data, we needed to locate the mines that meet the following criteria:

- The mine must be active.
- The mine must not also have a rail loading station on-site.
- The mine must not be located within 1.5 km of a railroad.

To find these mines using Python, it was necessary to set up a SQL statement that would select the mines that are active and do not have a rail loading station on site. The next major step was to select these mines by location to make sure they were not within 1.5 km of a railroad. My end result was a total of 44 mines meeting all of the criteria. Once the script ran, I went into ArcMap to explore the data and check that the script had run correctly: I created a buffer around the new output class to verify that none of the selected mines were truly within 1.5 km of a railroad. The data appeared to be correct. Below is a screenshot of the script that I wrote to complete these tasks.
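The attribute and location selections described above could be sketched like this. The field names (`STATUS`, `RAIL_LOADING`) and layer names are hypothetical placeholders, not the actual data schema, and arcpy is imported inside the function since it only exists in an ArcGIS environment.

```python
# Sketch of the Exercise 7 mine-selection logic. Field names and
# layer names are hypothetical placeholders.

def active_no_rail_clause(status_field="STATUS", rail_field="RAIL_LOADING"):
    """SQL where clause: mine is active AND has no on-site rail loading station."""
    return "{} = 'Active' AND {} = 'No'".format(status_field, rail_field)

def select_mines(mines_layer, railroads_layer, out_fc):
    # arcpy only exists inside an ArcGIS Python environment.
    import arcpy
    # Attribute selection: active mines without an on-site loading station.
    arcpy.SelectLayerByAttribute_management(
        mines_layer, "NEW_SELECTION", active_no_rail_clause())
    # Spatial selection: drop any selected mine within 1.5 km of a railroad.
    arcpy.SelectLayerByLocation_management(
        mines_layer, "WITHIN_A_DISTANCE", railroads_layer,
        "1.5 Kilometers", "REMOVE_FROM_SELECTION")
    # Write the remaining selection out as a new feature class.
    arcpy.CopyFeatures_management(mines_layer, out_fc)
```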





Exercise 8 Python Script

In Exercise 8, the final part was to use Python to create a weighted index of the environmental impacts. I decided that residential areas were the most important factor to weight, since they would be the hardest thing for a mine to work around, so I made them count for more. This was a relatively quick Python script: I set variables for all of the rasters used in the model, multiplied the residential area raster by 1.5 in the Raster Calculator to make it more important, and then added the weighted value to all of the other rasters in the Raster Calculator. The result was a weighted index. The Python script is located below.
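The weighted-index arithmetic can be illustrated with a small pure-Python sketch, using nested lists to stand in for rasters; in ArcMap the same expression runs cell-by-cell in the Raster Calculator. The 1.5 weight is from the text above; everything else here is illustrative.

```python
# Sketch of the Exercise 8 weighted-index arithmetic, with small nested
# lists standing in for raster grids.

RESIDENTIAL_WEIGHT = 1.5  # residential impact counts 1.5x, per the write-up

def weighted_index(residential, others):
    """Weight the residential grid by 1.5, then add the other grids cell-wise."""
    rows, cols = len(residential), len(residential[0])
    index = [[residential[r][c] * RESIDENTIAL_WEIGHT for c in range(cols)]
             for r in range(rows)]
    for raster in others:
        for r in range(rows):
            for c in range(cols):
                index[r][c] += raster[r][c]
    return index
```

For example, `weighted_index([[2, 4]], [[[1, 1]], [[3, 0]]])` gives `[[7.0, 7.0]]`, since each residential cell is multiplied by 1.5 before the other layers are added.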


Data Downloading, Interoperability, and Working with Projections in Python

Goal

The goal of this assignment was to become familiar with the process of downloading data from different sources on the internet, importing the data into ArcGIS, joining the data, projecting the data from these different sources into one coordinate system, and building and designing a geodatabase to store it. Finally, we created maps showing the results of the Python script that we wrote to accomplish these steps.


Methods

In this assignment, data collection was a major component. The first half of the assignment was spent going out to different sources on the internet and downloading data sets from them. These were all government sites, so we know they are trusted data sources. Before we started downloading data, I set up my data management: I made sure that I had an Exercise 5 folder with a working folder inside of it. The TEMP folder on the university computer was used to hold the initial downloaded zip files. From there, they were extracted into our working folder, which was further broken down into a folder for each website the data was downloaded from.

The first site that we visited was the US Department of Transportation, which supplied a transportation data set containing railway and road files. The AOI that was used was Trempealeau County. The next site was the USGS National Map Viewer, which provided land cover data for the Trempealeau County area; the two DEM rasters that went into one of our final maps also came from this site. From the USDA Geospatial Data Gateway, I obtained Trempealeau County's cropland data layer. Another source of data was the Trempealeau County land records office itself; this was probably the most important piece of data, as it contained an entire geodatabase of the county and all of its relevant data. The last site that was used was the USDA NRCS Web Soil Survey, which provided another of the rasters that will be shown in the results.

After the collection of data, it was time to import the SSURGO data that we got from the soil survey, which came as a large collection of tables. To import it, we used Microsoft Access, which converted the tables into a format that could be used in ArcMap. Next, I created a Python script to project, clip, and load all the data into the geodatabase: the script brought in all of the rasters that we gathered from the data sources, put them into an appropriate common coordinate system, and then clipped them to the shape of the Trempealeau County border. Figure 1 below is a screenshot of the script that was run.



Figure 1. Python script used to project, clip, and load collected rasters into the geodatabase.



Data Accuracy




Final Maps



Conclusion

After completing this assignment, I have a much better idea of how gathering data can be both extremely helpful and, at times, very frustrating. I am glad that we got to experience going out on the internet and gathering real data from different government sources. Using Python for basically the first time also shed light on the many different ways to accomplish tasks in ArcMap and in GIS in general.