Model Builder Project:
A Category 5 Nor'easter in Quebec City – Developing a Damage Assessment Report
Using ArcGIS 10.8.1 Python Shell and Model Builder Interface
Eidan W. Willis
The aim of the ENVB 530 Python Model Builder Mini-Project was to expand on our previous experience using ArcMap's Model Builder and Python Shell in Lab 4 by developing and executing a comprehensive analytical framework for a hypothetical real-world scenario. Students were provided with a catalogue of data – mainly shapefiles (point, line, and polygon) and some raster files (DEM rasters) – from which they could pick and choose; students were free to re-imagine the data they chose to derive a hypothetical scenario and conduct a series of geospatial analyses using ArcMap's Model Builder and the Python shell. Guidelines for the project were flexible, except for the following requirements that students' Python scripts were expected to meet. As directed, scripts should:
...contain a loop requiring at least 6 iterations. The number of iterations should be determined in one of the following ways:
You could iterate over a fixed number of items. This could be the number of bands or images to process, for example.
Your function could automatically create the list of items for iteration. If you are automatically generating the list (for example, by iterating over all points in a file, or iterating over all shapefiles in a data frame), that is perhaps more complicated, but is more flexible.
...perform a spatial analysis that results in an image. The analysis should include at least two layers.
As an example, you might use one of the layers to simulate bus stops, and compute the distance for every pixel to the nearest bus stop. You might use another layer to simulate metro stops of different colors, and compute the shortest distance to a stop on one of the two lines.
...write something to a text file when executed.
After you graduated from McGill, your first job was working for the Government of Quebec's Climate Disaster Response Team. After a few weeks of settling in, you're informed that the strongest nor'easter ever recorded is headed directly for the Atlantic mouth of the St. Lawrence River. The storm, whimsically dubbed the Nor'easter Easter Bunny by Quebec City locals, is projected to produce winds reaching speeds of up to 280 km/h (~175 mph) and flooding of up to 6m (~20 ft)*. Upon hearing the news, Quebec's Ministers of Education (fun fact: there are two of them!), along with other officials, race to understand how the city will be affected. They reach out to you and anxiously explain that they just watched the critically acclaimed movie 2012, starring John Cusack, last week (i.e., the one where a tsunami taller than the Himalayas destroys the world). As a result, they are paranoid that an impossibly high storm surge is going to wash away Quebec City and want you to:
*6m is really high for a storm surge...
...create storm surge damage projections at 6m, 10m, 20m, 30m, 45m, and 60m
...visualize which schools in the Quebec City region will be affected
...count and report the number of flooded schools
The Model Builder software within ArcMap was used to build a baseline model for the project's analytical process. This model was later exported as a Python script (.py) and edited using IDLE to create an iterative script capable of producing six unique storm surge projections based on the provided data. As mentioned, students were provided with a large catalogue of shapefiles of all sorts, with the intention that shapefiles of interest would be repurposed and reimagined for the project objectives. I began by converting the two separate DEM_East ("021l14_0200_deme.dem") and DEM_West ("021l14_0200_demw.dem") raster files into a contiguous mosaic raster file using the Mosaic To New Raster tool. This tool required that the two input rasters have the same number of bands (i.e., 1) and the same bit depth (i.e., 16_BIT_SIGNED). Since I wanted to use the resulting mosaic to conduct a variety of calculation-based geospatial analyses, it was important to ensure that the original geographic coordinate system was converted to a projected coordinate system appropriate to the region within which the images were taken. I therefore used the NAD 1983 2011 UTM Zone 19N spatial reference, which corresponds to the region in question – Quebec City. The resulting mosaic, dubbed Mosaic_DEM, was sent to the Default geodatabase and was used to calculate a Slope layer (Slope_DEM) and a Euclidean Distance layer (EucDist_Schools), and to develop our storm surge projections using the Raster Calculator tool.
For the slope calculation, I initially thought it necessary to develop a hillshade whose values would subsequently be used by the Slope tool to create the slope layer. I had a difficult time getting meaningful results through this procedure, however, so I instead used the projected Mosaic_DEM layer as the sole input for this tool. Unlike the Slope tool, the Euclidean Distance tool required both a raster input and a point shapefile input. Two originally unrelated shapefiles – "021L14_silo_p.shp" and "021L14_trans_s_p.shp" – dubbed Schools_1 and Schools_2, respectively, were merged into one Schools_Merged shapefile. I chose to do this because the resulting merged shapefile looked like a realistic distribution of schools across the Quebec City region. In this way, the Euclidean Distance tool used the Schools_Merged shapefile in conjunction with the Mosaic_DEM raster to visualize, for every cell, the shortest possible distance to the nearest school in the region.
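Conceptually, the Euclidean Distance tool assigns every raster cell the straight-line distance to the nearest source point. A toy sketch of that idea in plain Python (no arcpy; the grid size and "school" coordinates below are invented purely for illustration):

```python
import math

# Toy stand-in for the Euclidean Distance tool: for each cell in a small
# grid, compute the straight-line distance to the nearest "school" point.
schools = [(1, 1), (3, 4)]  # hypothetical (row, col) school locations

def euclidean_distance_grid(n_rows, n_cols, sources):
    grid = []
    for r in range(n_rows):
        row = []
        for c in range(n_cols):
            # distance from this cell to its closest source point
            d = min(math.hypot(r - sr, c - sc) for sr, sc in sources)
            row.append(d)
        grid.append(row)
    return grid

dist = euclidean_distance_grid(5, 5, schools)
# Cells containing a school are at distance 0 from themselves.
```

The real tool does the same thing per pixel over the Mosaic_DEM extent, just far more efficiently.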
It was originally intended that the slope layer would serve as the input to the Raster Calculator tool to create storm surge projections based on pixel height, but, upon testing, this procedure yielded an incorrect result. Instead, I again employed the Mosaic_DEM layer as the input for the Raster Calculator tool, which analyzed each pixel in the layer individually with respect to the parameters entered in the calculator window. Using Python syntax, we specify a map algebra equation that tells ArcMap to look for those pixels in the Mosaic_DEM raster that have an elevation value at or below 6 meters. The equation is simple, requiring only the input layer on which to apply the calculation, a boolean comparison operator (e.g., =, <, ≤, >, ≥), and an integer value representing the storm surge level (i.e., 6 meters). The result is a selection of all pixels that match the True cases of the boolean operation (i.e., those pixels at or below 6m). In order to perform any further analyses or selections, such as filtering for only those pixels bordering the St. Lawrence River (i.e., hydrologically connected regions), we first need to Reclassify the selection as a boolean raster file, where 0 represents False pixels (i.e., elevation > 6m) and 1 represents True pixels (i.e., elevation ≤ 6m).
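The map algebra and reclassification steps can be mimicked in plain Python on a toy elevation grid (the values below are invented; the real analysis runs on the Mosaic_DEM raster):

```python
# Toy emulation (plain Python, no arcpy) of the map algebra step:
# flag every cell at or below the 6 m surge level as 1 (True), else 0.
elevation = [
    [2, 5, 9],
    [4, 6, 12],
    [8, 3, 1],
]  # hypothetical elevation values in metres

surge_level = 6
flood_mask = [[1 if cell <= surge_level else 0 for cell in row]
              for row in elevation]
# flood_mask → [[1, 1, 0], [1, 1, 0], [0, 1, 1]]
```

The 0/1 mask is exactly what the Reclassify step produces, ready for conversion to polygons.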
A look at the selection containing only those regions within the 6m flood layer exhibiting hydrological connectivity (highlighted in cyan) vs. those that are not connected (not highlighted) to the Water layer representing the St. Lawrence River.
We then take the reclassified map algebra result and turn it into a shapefile using the Raster to Polygon tool, which allows us to more easily manipulate the layer with further selection criteria. In the next step, we use the Select Layer By Location tool to perform two specific functions: 1) select those schools that fall within the flooded region designated by the new 6m shapefile, and 2) select only those pixels in the 6m storm surge shapefile that directly border the water shapefile representing the St. Lawrence River and its tributaries. An important preprocessing note: the Select Layer By Location tool would not take raw shapefiles as inputs, so they first had to be converted to feature layers using the Make Feature Layer tool. This conversion was done for the Schools_Merged point shapefile in the first selection process (i.e., flooded schools), and for both the reclassified 6m polygon shapefile (ReclassDEM_Polygon6m) and the Water polygon shapefile in the second (i.e., hydrologically connected regions). We again have a selection of those pixels in the shapefile that match the Select Layer By Location criteria; we can then use the Copy Features tool to turn our selection into a feature layer of its own. Lastly, the flooded schools (i.e., Schools_Flooded) and complete schools (i.e., Schools_CountAll) layers were counted using the Summary Statistics tool with the COUNT statistic specified in the tool's parameters pop-up window. This gave us an idea of how many schools would be affected by the storm surge at 6m.
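The first Select Layer By Location step boils down to a point-in-polygon test followed by a count. A toy stand-in in plain Python, with the flood polygon simplified to an axis-aligned rectangle and all coordinates made up for illustration:

```python
# Toy stand-in for "Select Layer By Location: INTERSECT" plus the
# Summary Statistics COUNT: keep only the school points falling inside
# a flood polygon, here simplified to a rectangle.
flood_rect = (0, 0, 10, 5)  # hypothetical (xmin, ymin, xmax, ymax)
schools = [(2, 3), (12, 4), (7, 1), (9, 8)]  # hypothetical points

def inside(pt, rect):
    x, y = pt
    xmin, ymin, xmax, ymax = rect
    return xmin <= x <= xmax and ymin <= y <= ymax

flooded_schools = [pt for pt in schools if inside(pt, flood_rect)]
count = len(flooded_schools)  # analogous to the COUNT statistic
# flooded_schools → [(2, 3), (7, 1)], count → 2
```

The real tool handles arbitrary polygon geometry, but the select-then-count logic is the same.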
Figure 1. Full Model Builder workflow for my project.
After the above processes, we're left with a model that represents only one storm surge projection, at 6 meters in height. If we wanted to perform the other storm surge projections requested by the Ministers of Education, we would have to create six of these side by side, which would be time consuming both to build and to run. Instead, we can export our model to a Python script and add a for loop with which we can iterate over multiple storm surge depths. Using a list of Strings representing each of the six storm surge depths we're interested in (i.e., aListOfFloodDepths), we perform six iterations with a hand-written for loop that performs several actions (detailed below) on each of the six storm surge depths in the aListOfFloodDepths list variable. I thought it best to use the list of Strings with each list index acting as the input variable for the loop at each step of the increment. This method made it easy to implement the for loop after ensuring the list index was incremented at the end of each iteration. Another reason for this choice was simply that the looping process was intuitively easy for me to understand, which made debugging much less frustrating. The loop performed the following actions:
1) Iterate through each of the six storm surge depths at 6m, 10m, 20m, 30m, 45m, and 60m.
2) Use the CreateConstantRaster tool and the LessThanEqual tool to perform the map algebra necessary to select those pixels at or below the given surge level*.
*Note: The Raster Calculator tool does not work in ArcMap's python shell; as a result, a workaround was created using the aforementioned tools to emulate the effect of the Raster Calculator at each of the six surge levels in Arcpy.
3) Use the Reclassify tool to create a raster file from the result of the previous operations, where 0 represents pixels that do not fulfill the criteria specified by the LessThanEqual tool's boolean procedure (i.e., False pixels) and 1 represents pixels that do fulfill the criteria (i.e., True pixels).
4) Convert the reclassified flood raster to a polygon shapefile using the Raster To Polygon tool, allowing us to perform more meaningful manipulation later on.
5) With the Select Layer By Location tool, use the "INTERSECT" parameter to select those schools that intersect with the newly reclassified polygon shapefile representing the storm surge extent.
6) Convert your selection to a feature layer using the Copy Features tool.
7) Convert the reclassified polygon DEM output and the Water shapefile to feature layers to be able to manipulate them in the following step.
8) Using the reclassified DEM and Water feature layers, use the Select Layer By Location tool with the "CROSSED_BY_THE_OUTLINE_OF" parameter to select for only those flooded regions that are hydrologically connected to the water boundary.
9) As in Step 6, use the Copy Features tool to convert all of your hydrologically connected flood regions to a feature layer.
10) Count the number of flooded schools using the Summary Statistics tool with the parameter set as "OBJECTID COUNT".
11) Append your changes to the .txt file using the with open([XX.txt], "a") as [variable name] format; opening in append ("a") mode, rather than write ("w") mode, ensures that earlier results aren't overwritten at each step of the iteration. In theory, this allows all the summary statistics information of interest to be written to the .txt file within the loop, at each step of the iterative process.
12) Iterate through your list, ensuring that the list counter (i.e., the index of the aListOfFloodDepths at a given step in the loop) and scenario counter (i.e., a counter used to write to the text file) variables are successfully iterating through all the storm surge depths specified.
All files are saved iteratively and uniquely for each of the six flood depths using the same aListOfFloodDepths list. Make sure that the drive where you save your files has enough space as this is a relatively expensive process!
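The counter bookkeeping in steps 11 and 12 can also be sketched in a stand-alone way: Python's enumerate() supplies the scenario number automatically, so no manual list or scenario counters are needed. The file name and placeholder count below are illustrative only, not the actual Summary Statistics output:

```python
import os
import tempfile

# Sketch of the loop's bookkeeping: enumerate() yields both the scenario
# number and the flood depth, replacing manual counter variables.
flood_depths = ["6", "10", "20", "30", "45", "60"]
out_path = os.path.join(tempfile.gettempdir(), "stormSurgeOutput.txt")

# Write the header once in "w" mode...
with open(out_path, "w") as f:
    f.write("Storm surge damage assessment results:\n\n")

for scenario, depth in enumerate(flood_depths, start=1):
    flooded_count = 0  # placeholder for the real Summary Statistics count
    # ...then append ("a") so earlier scenarios are never overwritten.
    with open(out_path, "a") as f:
        f.write("Scenario %d (%sm): %d schools flooded\n"
                % (scenario, depth, flooded_count))
```

Each pass through the loop adds one line to the report file, one per surge depth.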
Running the script with the aforementioned loop yields six different collections of layers that compile to create maps that display the following:
1) The extent of the flooding at each of the six flood heights (i.e., height calculations were based on elevation values from the Mosaic_DEM raster file). These maps are seen below.
2) Accompanying point shapefiles visualizing those schools flooded in each of the six scenarios.
3) A composite layer visualizing all six flood projections, both extent and schools affected (top right).
4) Six different tables identifying the number of schools affected by the storm at each surge height*.
*Unfortunately, these tables were added directly to the Default geodatabase and are only accessible upon opening the .mxd file containing my project within ArcMap.
5) A Euclidean Distance layer visualizing, for every location, the shortest possible distance to the nearest school in the region (bottom right).
6) A .txt file meant to contain information on affected schools in each of the six flood regions, but something went awry (screenshot below; details in discussion section).
A composite layer containing all of the storm surge scenarios layered on top of one another from shallowest (6m in yellow) to deepest (60m in magenta). The original Water layer is hashed out.
The resultant Euclidean Distance layer comparing schools, both affected and unaffected by the storm, displaying for every location the shortest distance to the nearest school.
Projected damage extent (and affected schools) at a 6 meter tall surge.
Projected damage extent (and affected schools) at a 20 meter tall surge.
Projected damage extent (and affected schools) at a 45 meter tall surge.
Projected damage extent (and affected schools) at a 10 meter tall surge.
Projected damage extent (and affected schools) at a 30 meter tall surge.
Projected damage extent (and affected schools) at a 60 meter tall surge.
The (incorrect) output – written to a .txt file – created after executing the Python script in ArcMap.
This project proved to be an extremely time consuming – at times frustrating – endeavor, but I'm quite happy with the results. The storm surge projections at 6m, 10m, 20m, 30m, 45m, and 60m appear to be correctly calculated based on the information provided, suggesting that the mosaic I produced from the East and West DEM raster layers was also correctly projected in meters. The composite layer is useful, as it provides a quick and easy way of distinguishing all six projections from one another; viewers can also easily tell which schools were affected in each scenario and draw conclusions. With respect to the project goals, the next step would have been to develop evacuation plans for the schools affected in each scenario. When brainstorming this deliverable, I imagined using the Least Cost Path method to direct populations at affected schools to the nearest unaffected school, except in cases where the nearest unaffected school was more than 1 km away. In those cases, I would have used a separate shapefile containing point data representing points of high elevation in the Quebec City region, and performed Least Cost Path analyses between affected schools and the nearest points in this shapefile that were also unaffected by the surge. It would also be useful to filter the unaffected schools and elevation points by the absolute worst-case flood scenario of 60m, to make absolutely certain that populations remain safe regardless of the actual intensity of the storm.
The Euclidean Distance layer I created seems to have been incorrectly projected, as the layer extended beyond the actual map extent. This can be seen at the bottom edge of the layer, where it extends beyond the correctly-projected Roads shapefile. I believe this was the result of incorrectly setting the Processing Extent environment setting in the Euclidean Distance tool's parameters window within the model. Because this layer was not a central component of my project, however, it did not impact the main storm surge scenario deliverables at all.
Another error exists in the Python script, where the Summary Statistics step meant to write the number of schools affected in each flood scenario to the .txt file instead writes a String containing the path of the summary statistics table, rather than the contents of the table itself. As depicted in the following code snippet, the script attempts to write several contextual Strings describing the data alongside the floodedCount variable, which was intended to hold the actual integer number of flooded schools. When I saved the Summary Statistics output directly to the Default geodatabase and opened the table within ArcMap, it contained the correct information. I therefore believe the error lies in how the output of the arcpy.Statistics_analysis() method is written to the .txt file: the method returns a Result object, and converting that object to a String appears to yield the path of the output table rather than the count stored inside it. A workaround would need to read the count back out of the table itself, but I'm not 100% certain this fully explains the behaviour.
f.write("Number of schools flooded in Scenario " + str(scenarioCounter) + "(" + each_flood + "): " + floodedCount)
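This suspicion can be illustrated without arcpy: an object whose string form is a file path behaves exactly as observed when concatenated into an f.write() call. The FakeResult class below is purely hypothetical and only mimics that behaviour:

```python
# Toy illustration of the suspected bug: arcpy's Statistics_analysis
# returns a Result object, and its string form is the *path* of the
# output table, not the count stored inside it. FakeResult mimics that.
class FakeResult(object):
    def __init__(self, output_path):
        self._path = output_path

    def __str__(self):
        # like a Result object: string conversion yields the output path
        return self._path

flooded_count = FakeResult("C:/Default.gdb/Schools_Flooded_Count6m")

line = "Number of schools flooded: " + str(flooded_count)
# line now contains the table path, not the integer count -- the actual
# count would have to be read out of the table itself (e.g., with a
# cursor over the table's COUNT field in arcpy).
```

This matches the incorrect .txt output shown above: a path string where an integer was expected.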
Limitations
1) No Raster Calculator equivalent in Arcpy
Root of issue: The Raster Calculator tool available in ArcMap and Model Builder does not have an arcpy counterpart, meaning that I could not use this tool for my script.
Solution: I found a workaround using the CreateConstantRaster and LessThanEqual tools in tandem to do the following:
Create a constant raster layer with the CreateConstantRaster tool where each pixel in the raster contains an integer value corresponding to the storm surge height in each scenario.
Use the LessThanEqual tool with the two inputs being the Mosaic_DEM raster and the newly created constant raster layers; the result is a boolean layer containing those pixels that are less than or equal to its corresponding constant raster pixel (i.e., flood level).
Use the Reclassify tool to filter for only those pixels meeting the parameters of the logical operation.
2) Summary Statistics did not write correctly to .txt file
Root of Issue: The Summary Statistics output written to the .txt file was the path of the output table, likely because arcpy.Statistics_analysis() returns a Result object rather than the table's contents.
Solution: Find a workaround that extracts the raw count of flooded schools from the output table itself, perhaps by reading the table's COUNT field directly (the Extract By Attributes tool works on rasters, so a table-oriented alternative would be needed).
3) Storage limitations
Root of Issue: Each iteration in the for loop created several files and was storage intensive. As a result, I was unable to store my data in the P:\ drive of my remote desktop.
Solution: I stored my data in the C:\ drive of the remote desktop I was working on; this ended up biting me in the butt later on, as any updates or changes needed to be saved on the drive and then downloaded to my laptop or Google Drive. Whenever I neglected to re-download the data after a significant change, it was accessible only on that remote PC.
Better Solution: Either a cloud drive with more space on the remote PC or actually work at the PC in person (be sure to save your data onto a thumb drive!!).
4) Needed more time: Least Cost Path Analysis – Flooded Schools vs. Unaffected schools
Root of Issue: I didn't have enough time to implement this method in the project.
Solution: More time!
5) Needed more time: Least Cost Path Analysis – Flooded Schools vs. High Elevation Points
Root of Issue: ibid.
Solution: ibid.
Challenges
1) Euclidean Distance projection was off
Root of Issue: The resulting layer seems incorrectly projected, exceeding the map extent.
Potential Solution: I thought this had something to do with using the wrong input in the Processing Extent parameter within the Euclidean Distance environments window in Model Builder. I specified the parameter to be "Same as Mosaic_DEM", which might mean that the extent or pixel size of the Mosaic is incompatible with Euclidean Distance calculations. If this is the case, I would create a slope layer with the Slope tool and use that layer as the input instead, hopefully with more successful results.
2) Select Layer By Location required different input
Root of Issue: This tool requires inputs to be in feature layer format (or it just doesn't like shapefiles? Not really sure)
Solution: Using the Make Feature Layer tool as a preprocessing step to ensure that any inputs are in the correct format. To my knowledge, solution did not have any deleterious effects on successive analyses in the model.
3) Summary Statistics output was incorrect
Root of Issue: Summary Statistics output to the .txt file was a String of the path to the output file, not the contents of the table (i.e., the integer value representing the number of flooded schools)
Solution: Same as Limitation #2.
4) Remote Desktop is a pain in the butt
Root of Issue: Ranging from storage issues and automatic shutdowns due to pre-scheduled classtimes in the physical computer lab at Mac Campus, to being unable to access the PC with my data on it because other people were using it...
Solution: Could probably have been smarter with my storage techniques, but you can't really avoid other people using the computer when you need it. I had to drive to Mac Campus just to ask the person using the PC if I could download my data onto a thumb drive, then drive back downtown...yikes!
# -*- coding: utf-8 -*-
# ---------------------------------------------------------------------------
# PythonMBProject.Willis.Eidan.7.py
# Created on: 2022-02-23 16:30:23.00000
# (generated by ArcGIS/ModelBuilder)
# Description:
# ---------------------------------------------------------------------------
# Import arcpy module
import arcpy
from arcpy.sa import Raster, Int, Float, CreateConstantRaster, LessThanEqual
arcpy.CheckOutExtension('spatial')
# Local variables:
Vegatation = "Vegatation" #for visualization
Builtup = "Builtup" #for visualization
Buildings = "Buildings" #for visualization
Roads = "Roads"
outDir = "C:\\Users\\ewilli13\\documents\\ArcGIS\\Default.gdb\\" #output directory
# Set Geoprocessing environments
Mosaic_DEM = outDir + "Mosaic_DEM"
arcpy.env.extent = Mosaic_DEM
arcpy.env.cellSize = Mosaic_DEM
arcpy.env.overwriteOutput = True
# Process: Mosaic To New Raster
DEM_East = "DEM_East"
DEM_West = "DEM_West"
Default_gdb = outDir
arcpy.MosaicToNewRaster_management((DEM_East + ";" + DEM_West), Default_gdb, "Mosaic_DEM", "PROJCS['NAD_1983_2011_UTM_Zone_19N',GEOGCS['GCS_NAD_1983_2011',DATUM['D_NAD_1983_2011',SPHEROID['GRS_1980',6378137.0,298.257222101]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Transverse_Mercator'],PARAMETER['False_Easting',500000.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-69.0],PARAMETER['Scale_Factor',0.9996],PARAMETER['Latitude_Of_Origin',0.0],UNIT['Meter',1.0]]", "16_BIT_SIGNED", "", "1", "LAST", "FIRST")
# Process: Slope
Slope_DEM = outDir + "Slope_DEM"
arcpy.Slope_3d(Mosaic_DEM, Slope_DEM, "DEGREE", "1", "PLANAR", "METER")
# Process: Merge
Schools_1 = "Schools_1"
Schools_2 = "Schools_2"
Schools_Merged = outDir + "Schools_Merged"
arcpy.Merge_management((Schools_1 + ";" + Schools_2), Schools_Merged, "ENTITYNAME \"ENTITYNAME\" true true false 40 Text 0 0 ,First,#,Schools_1,ENTITYNAME,-1,-1,Schools_2,ENTITYNAME,-1,-1;NTS \"NTS\" true true false 6 Text 0 0 ,First,#,Schools_1,NTS,-1,-1,Schools_2,NTS,-1,-1;EDITION \"EDITION\" true true false 16 Double 0 16 ,First,#,Schools_1,EDITION,-1,-1,Schools_2,EDITION,-1,-1;VERSION \"VERSION\" true true false 16 Double 0 16 ,First,#,Schools_1,VERSION,-1,-1,Schools_2,VERSION,-1,-1;THEME \"THEME\" true true false 2 Text 0 0 ,First,#,Schools_1,THEME,-1,-1,Schools_2,THEME,-1,-1;ID \"ID\" true true false 16 Double 0 16 ,First,#,Schools_1,ID,-1,-1,Schools_2,ID,-1,-1;CODE_GENER \"CODE_GENER\" true true false 16 Double 0 16 ,First,#,Schools_1,CODE_GENER,-1,-1,Schools_2,CODE_GENER,-1,-1;ATC \"ATC\" true true false 16 Double 0 16 ,First,#,Schools_1,ATC,-1,-1,Schools_2,ATC,-1,-1;ATG \"ATG\" true true false 16 Double 0 16 ,First,#,Schools_1,ATG,-1,-1,Schools_2,ATG,-1,-1;ATZ \"ATZ\" true true false 16 Double 0 16 ,First,#,Schools_1,ATZ,-1,-1,Schools_2,ATZ,-1,-1;ATE \"ATE\" true true false 16 Double 0 16 ,First,#,Schools_1,ATE,-1,-1,Schools_2,ATE,-1,-1;ACCURACY \"ACCURACY\" true true false 11 Double 0 11 ,First,#,Schools_1,ACCURACY,-1,-1,Schools_2,ACCURACY,-1,-1;TYPE \"TYPE\" true true false 11 Double 0 11 ,First,#,Schools_1,TYPE,-1,-1,Schools_2,TYPE,-1,-1;ELEVATION \"ELEVATION\" true true false 31 Double 15 30 ,First,#,Schools_1,ELEVATION,-1,-1,Schools_2,ELEVATION,-1,-1;ANGLE \"ANGLE\" true true false 31 Double 15 30 ,First,#,Schools_1,ANGLE,-1,-1,Schools_2,ANGLE,-1,-1")
# Process: Euclidean Distance
EucDist_Schools = (outDir + "EucDist_Schools")
#Optional direction/back-direction outputs and barrier data left empty
arcpy.gp.EucDistance_sa(Schools_Merged, EucDist_Schools, "", Mosaic_DEM, "", "PLANAR", "", "")
# Process: Make Feature Layer (2)
Water = "Water"
Water_Layer = "Water_Layer"
arcpy.MakeFeatureLayer_management(Water, Water_Layer, "", "", "FID FID VISIBLE NONE;Shape Shape VISIBLE NONE;ENTITYNAME ENTITYNAME VISIBLE NONE;NTS NTS VISIBLE NONE;EDITION EDITION VISIBLE NONE;VERSION VERSION VISIBLE NONE;THEME THEME VISIBLE NONE;ID ID VISIBLE NONE;CODE_GENER CODE_GENER VISIBLE NONE;ATC ATC VISIBLE NONE;ATG ATG VISIBLE NONE;ATZ ATZ VISIBLE NONE;ATE ATE VISIBLE NONE;ACCURACY ACCURACY VISIBLE NONE;TYPE TYPE VISIBLE NONE;ELEVATION ELEVATION VISIBLE NONE;CENTROID_X CENTROID_X VISIBLE NONE;CENTROID_Y CENTROID_Y VISIBLE NONE")
# Process: Make Feature Layer
Schools_Layer = "Schools_FeatureLayer"
arcpy.MakeFeatureLayer_management(Schools_Merged, Schools_Layer, "", "", "OBJECTID OBJECTID VISIBLE NONE;Shape Shape VISIBLE NONE;ENTITYNAME ENTITYNAME VISIBLE NONE;NTS NTS VISIBLE NONE;EDITION EDITION VISIBLE NONE;VERSION VERSION VISIBLE NONE;THEME THEME VISIBLE NONE;ID ID VISIBLE NONE;CODE_GENER CODE_GENER VISIBLE NONE;ATC ATC VISIBLE NONE;ATG ATG VISIBLE NONE;ATZ ATZ VISIBLE NONE;ATE ATE VISIBLE NONE;ACCURACY ACCURACY VISIBLE NONE;TYPE TYPE VISIBLE NONE;ELEVATION ELEVATION VISIBLE NONE;ANGLE ANGLE VISIBLE NONE")
aListOfFloodDepths = ["60",
"45",
"30",
"20",
"10",
"6"]
#Output file to write to
outputFile = open("stormSurgeOutput.txt", "w")
outputFile.write("Results of storm surge damage assessments at 6m, 10m, 20m, 30m, 45m, and 60m are displayed below: \n \n")
outputFile.close() #release the handle; the loop below re-opens the file in append mode
#Counter Variables
listCounter = 0
scenarioCounter = 1
for each_flood in aListOfFloodDepths:
    print each_flood
    ## # Process: Raster Calculator (does not work from the Python shell; kept for reference)
    ## rasterCalcOutput = (outDir + "DEM_rastercalc" + each_flood + "m")
    ## inExpression = ("%Mosaic_DEM% <= " + each_flood)
    ## arcpy.gp.RasterCalculator_sa(inExpression, rasterCalcOutput)
    #Work-around: CreateConstantRaster and LessThanEqual emulate the Raster Calculator
    constantValue = int(each_flood)
    constantRaster = CreateConstantRaster(constantValue, "INTEGER")
    constantRaster.save(outDir + "constantRaster_" + each_flood + "m")
    lteOutput = LessThanEqual(Raster(Mosaic_DEM), constantRaster)
    lteOutput.save(outDir + "DEM_lteOutput" + each_flood + "m")
    # Process: Reclassify (0 = above surge level, 1 = at or below surge level)
    reclassOutput = (outDir + "Reclass_DEM" + each_flood + "m")
    arcpy.gp.Reclassify_sa(lteOutput, "Value", "0 NODATA;0 1 1", reclassOutput, "NODATA")
    # Process: Raster to Polygon
    tempEnvironment0 = arcpy.env.outputZFlag
    arcpy.env.outputZFlag = "Disabled"
    tempEnvironment1 = arcpy.env.outputMFlag
    arcpy.env.outputMFlag = "Disabled"
    reclassPolygonOutput = (outDir + "ReclassDEM_Polygon" + each_flood + "m")
    arcpy.RasterToPolygon_conversion(reclassOutput, reclassPolygonOutput, "SIMPLIFY", "Value", "SINGLE_OUTER_PART", "")
    arcpy.env.outputZFlag = tempEnvironment0
    arcpy.env.outputMFlag = tempEnvironment1
    # Process: Select Layer By Location (schools inside the flood extent)
    arcpy.SelectLayerByLocation_management(Schools_Layer, "INTERSECT", reclassPolygonOutput, "", "NEW_SELECTION", "NOT_INVERT")
    # Process: Copy Features
    Schools_Flooded = (outDir + "Schools_Flooded" + each_flood + "m")
    arcpy.CopyFeatures_management(Schools_Layer, Schools_Flooded, "", "0", "0", "0")
    # Process: Make Feature Layer (3)
    reclassPolyLayerOutput = ("ReclassDEM_Polygon" + each_flood + "m_" + "Layer")
    arcpy.MakeFeatureLayer_management(reclassPolygonOutput, reclassPolyLayerOutput, "", "", "OBJECTID OBJECTID VISIBLE NONE;Shape Shape VISIBLE NONE;Id Id VISIBLE NONE;gridcode gridcode VISIBLE NONE;Shape_Length Shape_Length VISIBLE NONE;Shape_Area Shape_Area VISIBLE NONE")
    # Process: Select Layer By Location (2) (flood polygons touching the water boundary)
    arcpy.SelectLayerByLocation_management(reclassPolyLayerOutput, "CROSSED_BY_THE_OUTLINE_OF", Water_Layer, "", "NEW_SELECTION", "NOT_INVERT")
    # Process: Copy Features (2)
    hydroConnectedOutput = (outDir + "DEMPoly_" + each_flood + "m_HydroConnected")
    arcpy.CopyFeatures_management(reclassPolyLayerOutput, hydroConnectedOutput, "", "0", "0", "0")
    # Process: Summary Statistics (output table saved to the Default geodatabase)
    Schools_Flooded_Count = (outDir + "Schools_Flooded_Count" + each_flood + "m")
    floodedCount = arcpy.Statistics_analysis(Schools_Flooded, Schools_Flooded_Count, "OBJECTID COUNT", "")
    # Finalize: Write to outputFile
    with open("stormSurgeOutput.txt", "a") as f:
        f.write("Storm Surge Scenario " + str(scenarioCounter) + ": " + each_flood + "m\n")
        #str() on a Result object yields the output table's path, not the count inside it
        f.write("Number of schools flooded in Scenario " + str(scenarioCounter) + " (" + each_flood + "m): " + str(floodedCount) + "\n")
    listCounter = listCounter + 1
    scenarioCounter = scenarioCounter + 1
#End Loop
#End Loop
# Process: Summary Statistics (2)
Schools_CountAll = (outDir + "Schools_CountAll") #output table saved to the Default geodatabase
totalCount = arcpy.Statistics_analysis(Schools_Merged, Schools_CountAll, "OBJECTID COUNT", "")
with open("stormSurgeOutput.txt", "a") as f:
    f.write("Total number of schools in the region: " + str(totalCount) + "\n")
#Close the file
outputFile.close()
#End of Script