Saturday, March 12, 2011

Final Project



 Introduction
With rising gas prices and a growing demand to limit our carbon footprint in California, transit-oriented living spaces are on the rise.  It is a movement that will alter urban landscapes, which are currently extremely dependent on private automobiles, by locating new developments closer to public transit hubs and by creating environments where people live, work, shop, and play within close proximity to each other.  By creating an urban environment where people live close enough to walk to work, or close enough to public transit to walk to it, people will have the option to become less reliant on their cars.  Changing downtown districts from shopping-only or business-only zones into mixed-use areas will require new developments, which means either vacant parcels will be utilized or pre-existing buildings will be altered to meet these new demands. In two Northern California counties, Sonoma and Marin, this may become a reality in 2014 when the SMART (Sonoma Marin Area Rail Transit) system is completed. SMART is constructing a rail line from Larkspur to Cloverdale with stops expected in all the major Sonoma County towns along Highway 101, including Petaluma, Cotati, Rohnert Park, Santa Rosa, Windsor, Healdsburg, and Cloverdale, as well as towns within Marin County.  The rail line will connect to the Larkspur Ferry, which connects to the Ferry Building in San Francisco, making it possible for someone living in Cloverdale to travel all the way to downtown San Francisco without the use of a car. The line is expected to start moving passengers by the summer of 2014, which means the areas surrounding the future train stations have approximately three years to develop potential sites.  The new sites will have to be close enough to the train station to walk there, but also close enough to walk to other downtown amenities such as shopping and jobs. 
The new sites will have to be large enough to build high-density, mixed-income, transit-oriented developments while remaining close enough to downtown centers to allow people access to shopping, restaurants, jobs, and other basic necessities by foot or public transit.  With 14 stations proposed along the SMART system, the door opens for development firms in 14 different locations to begin looking at potential sites. I chose to examine the potential sites surrounding the proposed Healdsburg train station because it is the town I grew up in, and I would love to see a transit-oriented, high-density residential complex close enough to walk to downtown while still being able to access the greater Bay Area without having to own a car.
Methods
My ultimate goal was to select a single parcel to develop within walking distance of the new train station in Healdsburg while also remaining within walking distance of downtown.  The parcel would need to be large enough to build high-density housing while remaining affordable enough to make a profit. For these parameters I needed to limit parcels to those larger than 30,000 square feet and exclude parcels priced over $4 million. Other factors that would limit my search for the optimal parcel were slope, hydrology, and land use zoning for the city of Healdsburg.  To accomplish this I had to obtain data from various sources, starting at the state level and working my way down to the city level and finally the individual parcel level.  I began by clipping all of the layers of analysis to the Healdsburg city boundary, as most of them were obtained at the countywide extent.  I then geocoded the Healdsburg train station using latitude and longitude data I obtained from the SMART website.  Next I created a ¼-mile buffer around the train station and another ¼-mile buffer around the downtown plaza, then used a select-by-location query to select only the parcels within the intersection of the two buffers.  I then created a 50 ft buffer around all streams in the downtown environment, with which I was able to select any of the preselected parcels that intersected this buffer and remove them, as any overlap between streams and the new development would be unacceptable.  The next step was to use an attribute query to remove any parcels from the current selection that overlapped an unacceptable slope of greater than 5%. The final step involved evaluating the zoning restrictions on parcels within the city of Healdsburg.
I chose to only accept parcels zoned as either commercial or residential; my new development site would incorporate both, so the zoning would have to be altered either way, but at least one of the two uses would already be allowable.  After refining my search down to approximately 50 parcels, I used an attribute query to select only parcels greater than 30,000 square feet, and then another attribute query to select only parcels appraised under $4 million.
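The attribute-query stage boils down to simple filtering logic. As a minimal sketch (the parcel records and values below are hypothetical examples, not Healdsburg's actual data):

```python
# Hypothetical parcels remaining after the location and buffer selections.
parcels = [
    {"apn": "002-101-01", "zoning": "commercial",  "area_sqft": 62000, "value": 2578000},
    {"apn": "002-101-02", "zoning": "residential", "area_sqft": 18000, "value": 950000},
    {"apn": "002-101-03", "zoning": "industrial",  "area_sqft": 45000, "value": 3200000},
    {"apn": "002-101-04", "zoning": "commercial",  "area_sqft": 41000, "value": 5100000},
]

def meets_criteria(p):
    """Mirror the attribute queries: allowed zoning, > 30,000 sq ft, < $4M."""
    return (p["zoning"] in ("commercial", "residential")
            and p["area_sqft"] > 30000
            and p["value"] < 4_000_000)

candidates = [p for p in parcels if meets_criteria(p)]
```

In this toy data set only the first parcel passes all three queries, mirroring how my real analysis narrowed 50 parcels down to one.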

Results
After every layer was incorporated into the analysis, only one parcel remained, which made the final decision very simple. The slope layer, while necessary, didn’t have a large impact on selecting parcels because the greater downtown area is relatively flat, as are most downtown areas. The 50 ft stream buffer also played a minimal role in eliminating parcels, as only a small section of stream ran through the western edge of the downtown ¼-mile buffer. The location, area, and price factors had the largest impact on deciding which parcels were suitable; I’m sure that at other locations along the SMART route the different factors will vary in relevance.  My final parcel was located at 122 Mill Street, currently a commercial building.  The parcel met all of my requirements: it is 60,000 square feet, and its estimated value was $2,578,000 according to the city of Healdsburg in 2009. This parcel, located directly between the proposed train station and the downtown Healdsburg shopping district, will be perfect for developing a transit-oriented, mixed-use living complex where people can live, work, and shop within close proximity.
Conclusion
The process used in this example could be applied to the remaining 13 SMART station locations to locate similar developable parcels.  In this way, each downtown area throughout Sonoma and Marin Counties could equip itself with high-density, transit-oriented living spaces.  By seeking only parcels that meet the requirements for development, this method can help ensure the success of the projects that locate in these areas.  Location and proximity are extremely powerful factors in this type of development plan and require precise analysis to ensure success. When implemented correctly, this plan can alter the urban landscape in a positive way by changing how people move through it. In turn, having more people using the public transit system would also have positive impacts on the Highway 101 corridor, which is often overcrowded with cars.  The potential for investors to locate ideal developable sites within each of these communities creates great financial opportunities while also creating new options for those who would rather break ties with their automobiles.  All in all, the new demand for transit-oriented living spaces is going to create demand for developable parcels near transit hubs, and finding lucrative parcels is going to depend upon the strict limitations set by potential investors, which in this case is me.
Parcel selection within the .25 mile train station buffer
Final Parcel Map
    

Friday, February 25, 2011

Interpolation Lab

 





Interpolation is the process of estimating unknown values based on their spatial relationship to known values. Without knowing the value at a specific location, interpolation can infer it from that location’s relationship to locations with known values. Many methods of interpolation exist, each calculating the unknown values in a unique way. The methods used in this lab are IDW (Inverse Distance Weighting), Kriging, and Spline.  IDW uses a linearly weighted combination of sample points to determine unknown values. It rests on the idea that the farther two cells are from each other, the less influence they have on each other. This follows Tobler’s first law and the principle of distance decay, where things generally become less connected as the distance between them increases. So sample points that are closer to the location being estimated are weighted more heavily than points that are farther away. Kriging, on the other hand, “assumes that the distance or direction between sample points reflects a correlation that can be used to explain variation in the surface” (Childs 3). Kriging predicts values within a specified radius using a sophisticated weighted-average technique.  The Spline method estimates values in an attempt to minimize surface curvature; it tries to create a smooth surface that runs through all input values, filling in the unknown values in between.
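IDW is simple enough to sketch directly. Assuming sample points given as (x, y, value) tuples and the common power-of-2 distance weighting (a generic illustration of the method, not ArcGIS's exact implementation):

```python
import math

def idw(x, y, samples, power=2):
    """Inverse Distance Weighting: estimate the value at (x, y) from known
    sample points (xi, yi, vi). Nearer points weigh more, following the
    distance-decay idea described above."""
    num, den = 0.0, 0.0
    for xi, yi, vi in samples:
        d = math.hypot(x - xi, y - yi)
        if d == 0:
            return vi            # exactly on a sample point: return it
        w = 1.0 / d ** power     # weight falls off with distance
        num += w * vi
        den += w
    return num / den

# Four corner samples; the center is equidistant from all of them,
# so the estimate there is simply their mean:
# idw(0.5, 0.5, [(0, 0, 10), (1, 0, 10), (0, 1, 20), (1, 1, 20)]) gives 15.0
```

Raising `power` makes the surface honor nearby samples more strongly; lowering it smooths the estimate toward a global average.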

For this lab we interpolated the current amount of rainfall in Los Angeles County along with traditional normal rainfall values for the county. First we constructed a table in Excel with all the rain gauge stations in L.A. County and added their latitude and longitude coordinates, which we had to convert into decimal-degree format.  Columns for current rainfall and normal rainfall were added, and then the Excel table was exported as a .dbf file.  In ArcMap, a Los Angeles County boundary shapefile was added, and then the table. The rain gauge stations were displayed using “display X-Y values”, based on the latitude/longitude coordinates.  Two different interpolation methods were then applied to the data points, resulting in two different outputs: one based on the IDW method, the other based on the Kriging method. Both methods were applied to the normal rainfall values and the current rainfall values. Once the interpolations were performed, difference maps were made using the raster calculator to show the difference between current and normal rainfall values.  It appears that the eastern part of the county received more rainfall than the western portion, and that current rainfall values are lower than normal.
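The raster-calculator step reduces to cell-by-cell subtraction. A minimal sketch with hypothetical 3x3 rainfall grids (inches), not the lab's actual rasters:

```python
# Hypothetical interpolated rainfall surfaces as nested lists of cells.
current = [[4.1, 4.3, 5.0],
           [3.8, 4.0, 4.9],
           [3.5, 3.9, 4.7]]
normal  = [[6.0, 6.2, 6.5],
           [5.8, 6.0, 6.4],
           [5.5, 5.9, 6.2]]

# "Raster calculator" difference map: current minus normal, per cell.
difference = [[c - n for c, n in zip(crow, nrow)]
              for crow, nrow in zip(current, normal)]
```

In this toy example every cell of the difference grid is negative, which is how a difference map would show rainfall running below normal across the whole area.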

I think the inverse distance weighting technique is the best interpolation method for this task. Because of the nature of precipitation, there is usually a gradual change in values, with similarity between areas generally decreasing with distance. The IDW method is a deterministic interpolation technique that creates a surface based on measured points, whereas Kriging is a geostatistical approach and a more advanced surface-prediction technique. Kriging would not be as accurate for this data set because no directional bias in the data values is known. 

Tuesday, February 22, 2011

Spatial Analysis 2


As global climate change continues to alter local and state patterns of vegetation growth and weather, an ever-increasing demand for rapid collection and analysis of data is created.  As climate change increases average summertime temperatures and decreases winter rainfall, more extreme fire conditions are being created that sweep through Northern and Southern California.  Fire crews are burdened by more work than they can possibly handle without calling in extra support from all over the state and nation.  Fire crews can be more efficient if they are given as much information as possible about potential fire hazard areas before fires start; that way they can address these areas with mitigation tactics ahead of time.  If fire crews know that an area is prone to extreme fire behavior, they can perform vegetation management techniques such as clearing brush, doing controlled burns, or speaking to homeowners about managing their properties responsibly.

A site analysis like the one in this lab can be used to determine areas that are more hazardous than others when exposed to fire. The factors in this lab were slope and vegetation type. Of course, other factors such as wind, relative humidity, and elevation also help determine the extremity of a fire, but for the purposes of this lab we are only taking these two into account.  The lab focused on the Station Fire, which occurred in Los Angeles County in the summer of 2009. The slope factor was created from a digital elevation model obtained from the USGS Seamless Server, using the Spatial Analyst slope tool in ArcGIS. Once a slope layer was created, it was reclassified into categories of increasing hazard, steeper being more hazardous.  A similar task was performed using a raster layer of vegetation types throughout Los Angeles. The vegetation layer was converted into vector format and then reclassified using NFPA vegetation standards to assign a hazard level to each vegetation type.  Once both the slope and vegetation layers were reclassified into hazard levels, they were combined into one hazard layer using the raster calculator. The layer you see covering the Station Fire shows the overall hazard levels given slope and vegetation type.
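The reclassify-and-combine step can be sketched per cell as follows. The slope class breaks and vegetation ratings here are illustrative stand-ins, not the lab's exact NFPA-derived values:

```python
def slope_hazard(slope_pct):
    """Reclassify slope (percent) into hazard classes; steeper is worse.
    The break points are hypothetical examples."""
    if slope_pct < 10:
        return 1
    if slope_pct < 25:
        return 2
    if slope_pct < 40:
        return 3
    return 4

# Hypothetical vegetation hazard classes in the spirit of NFPA-style ratings.
VEG_HAZARD = {"grass": 2, "chaparral": 4, "oak_woodland": 3, "bare": 1}

def combined_hazard(slope_pct, veg_type):
    """Equivalent of the raster-calculator overlay: sum the two class values."""
    return slope_hazard(slope_pct) + VEG_HAZARD[veg_type]
```

A steep chaparral cell (e.g. 45% slope) lands in the highest combined class, while flat bare ground lands in the lowest, which is exactly the pattern the final hazard layer displays.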

The map created during this lab can be extremely useful in explaining why the Station Fire was as intense and devastating as it was: almost the entire area is very steep and covered by chaparral-type vegetation that burns very well. Factors not displayed in the map are the wind and the temperatures during the fire.  These also play a very large role in a fire’s ferocity, but are more difficult to build a map layer from because of their rapidly changing nature.  Aside from these minor deficiencies, site analyses can still be an extremely powerful tool when applied in fields such as fire behavior.  With the addition of some remote sensing technology and a fast computer processor, this operation could be done in a matter of hours for the entire state of California, making it extremely valuable for wildland fire departments throughout the country.  

Tuesday, February 15, 2011

Site Suitability Analysis


The implications of installing a new landfill or expanding an existing one can be extremely controversial and extremely complex, with the possibility of endangering many people if precautions are ignored or overlooked.  California’s largest toxic waste landfill, located in the Central Valley in Kettleman City, plans to expand its facility, but concerned state officials placed a moratorium on the expansion until an investigation into the landfill’s effects on drinking water is completed.  Higher-than-normal levels of arsenic have been reported, along with birth defects in the area.  Whether these are directly related to the landfill or a byproduct of the pesticides and other chemicals widely used by the agricultural community has yet to be determined.

As far as the expansion of the landfill is concerned, a site suitability analysis still needs to be performed. A site suitability analysis takes many different factors into account and then determines the best locations for a specific activity to take place or a specific structure to be built.  Different activities, and different types of structures, obviously use different factors to narrow down the best place.  Generally a site suitability analysis finds locations where all the desirable factors overlap, or where all the undesirable factors are absent.  It is a very efficient way to determine the best site for building or for performing a certain activity.

In the landfill-expansion suitability analysis, five factors were taken into account to determine the best locations to enlarge the current landfill: slope, distance from streams, land cover, distance from the existing landfills, and the drainage of different soil types.  While some site suitability analyses treat each factor equally, our analysis weights each factor based on its overall importance. The type of soil and its drainage properties, along with the slope of the terrain, were each weighted at 0.3; distance from streams was weighted at 0.2; land cover and distance to existing landfills were each weighted at 0.1. Combining all five weighted factors gave a final output in one standardized ranking system.
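The weighted overlay reduces to a weighted sum of pre-ranked factor scores. A minimal sketch using the weights above (the 1-5 factor scores for the example cell are hypothetical):

```python
# The five factor weights from the analysis; they sum to 1.0.
WEIGHTS = {"soil": 0.3, "slope": 0.3, "streams": 0.2,
           "land_cover": 0.1, "landfill_dist": 0.1}

def suitability(scores):
    """Weighted overlay: each factor is pre-ranked on a common scale
    (here 1-5, higher = more suitable), then combined by its weight."""
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

# A hypothetical candidate cell ranked per factor on the 1-5 scale.
cell = {"soil": 4, "slope": 5, "streams": 3, "land_cover": 2, "landfill_dist": 5}
```

Because the weights sum to 1, the output stays on the same 1-5 scale as the inputs, giving the standardized ranking the lab describes.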

While a site suitability analysis is very useful for visualizing an area’s viable options for expansion, it must also be backed up by field operations.  Kettleman City may already be experiencing negative effects from the landfill and would be further impacted by any expansion, so the investigation determining the impacts of the landfill in its current state must be completed.  In general, site suitability tests are useful for identifying viable areas, which should then be followed up by extensive field work to assess the areas being suggested.

Overall, the suitability analysis function is an extremely powerful tool to have access to.  Not only does it save huge amounts of time, labor, and money, it is one of the most efficient ways of assessing land suitability.  While it is an extremely useful way to model reality, it should never substitute for reality.  Suitability analyses should always be confirmed in the field, and additional tests should be conducted to confirm the findings.  A suitability analysis is a model that should always be backed up by tangible evidence such as drainage properties, slope surveys, soil composition, watershed examinations, etc.  Suitability analyses will continue to be a valuable tool as long as they are used responsibly and are always checked for accuracy.    

Wednesday, February 2, 2011







I believe the concept of limiting medical marijuana dispensaries to areas that are not within 1000 ft of places where children and adolescents congregate, such as schools, parks, and libraries, is a good one that should be implemented if more dispensaries are to be developed.  It is quite clear from the map on the right that plenty of viable options for new locations are available within the city of Los Angeles. I don't think medical marijuana dispensaries rely on proximity to youth hubs, nor do medical marijuana patients find it much of an inconvenience to travel the extra distance.  I think keeping dispensaries 1000 ft from schools, libraries, and parks helps remove some of the temptation for youths to attempt to buy marijuana and helps promote learning in schools by removing a potential distraction.  Dispensary owners might argue that these 1000 ft buffer zones remove potentially profitable areas to place their businesses, but I think the map shows that many options are still left open. On a more moral note: are we, as capitalist citizens of California, more concerned with whether a small business profits, or with whether our children are learning and playing in a safe environment free of temptations to start using marijuana, which may lead to further drug use? Overall, I think the main issue here is to limit exposure to children while still having enough locations for patients to access their marijuana.  By mapping areas that have high concentrations of children and excluding dispensaries from them, this can be achieved; such a map will also reveal all the locations where a dispensary might be needed.  The limitation stated in the LA Times article, capping new dispensaries at 70, is also a good idea because it would hopefully distribute the dispensaries throughout Los Angeles. Another way to achieve this might be to put specific limitations on the allowed density, for example applying buffer zones around each new dispensary to limit densification. I think the ideas are good, but there is still much more that can be done to limit exposure to children while protecting the livelihood of the dispensaries and their patients.
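The 1000 ft exclusion rule is easy to express as a point-in-buffer test. A minimal sketch with hypothetical planar coordinates in feet (not real Los Angeles locations):

```python
import math

def within_buffer(site, hubs, radius_ft=1000):
    """True if a candidate dispensary site falls inside any youth-hub
    buffer (school, park, or library)."""
    return any(math.hypot(site[0] - hx, site[1] - hy) <= radius_ft
               for hx, hy in hubs)

hubs = [(0, 0), (5000, 5000)]                 # e.g. a school and a park
sites = [(300, 400), (3000, 1000), (5200, 5900)]

# Keep only candidate sites outside every 1000 ft buffer.
allowed = [s for s in sites if not within_buffer(s, hubs)]
```

Here the first site is 500 ft from the school and the third is about 922 ft from the park, so both are excluded; only the middle site survives, just as a buffer-exclusion map keeps only areas outside every youth-hub buffer.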

Friday, January 21, 2011

Week 3 Lab 3 - Geocoding

For my geocoding lab project I first decided to map bicycle shops in San Francisco.  I acquired 50 addresses and made an Excel table organizing the name, address, and zip code for use in ArcMap.  I downloaded a streets shapefile from the San Francisco government website and made an address locator from it.  I then geocoded the 50 bike shop addresses onto the San Francisco streets reference layer.  Most of my addresses matched quite easily, and those that didn’t were matched manually.  By geocoding the addresses I was able to display the distribution of bike shops throughout the San Francisco peninsula with great accuracy. 
After realizing how easy this can be once the steps become familiar, standard, and repeatable, I decided to complicate the analysis by adding geocoded brewery locations.  I wanted to find breweries that were close to bike shops and, vice versa, bike shops that were close to breweries.  Bike shops and breweries are places that I frequent, and I wanted to know where it would be most convenient to visit in order to efficiently experience both. 
I made another table with brewery locations and geocoded those addresses on the same reference layer.  I then made 1000 ft buffers around both the breweries and the bike shops and used a select-by-location query to find bike shops within 1000 ft of breweries, and a separate query to find breweries within 1000 ft of bike shops.  I made separate symbols for the locations that were close to each other and labeled them so you could find them in the address table.   
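The buffer-and-select step amounts to a distance test between the two geocoded point sets. A minimal sketch with hypothetical lon/lat points and a rough feet-per-degree conversion (an equirectangular approximation, fine at city scale but not how ArcMap computes it):

```python
import math

def dist_ft(a, b):
    """Approximate ground distance in feet between two (lon, lat) points."""
    ft_per_deg = 364000  # roughly feet per degree of latitude
    dx = (a[0] - b[0]) * ft_per_deg * math.cos(math.radians((a[1] + b[1]) / 2))
    dy = (a[1] - b[1]) * ft_per_deg
    return math.hypot(dx, dy)

# Hypothetical geocoded locations (lon, lat), not real businesses.
bike_shops = {"Shop A": (-122.420, 37.770), "Shop B": (-122.400, 37.790)}
breweries  = {"Brew X": (-122.421, 37.771), "Brew Y": (-122.450, 37.750)}

# Every bike shop / brewery pair within 1000 ft of each other.
pairs = [(s, b) for s, sp in bike_shops.items()
                for b, bp in breweries.items()
                if dist_ft(sp, bp) <= 1000]
```

In this toy data only Shop A and Brew X (about 460 ft apart) end up paired, which is the same result the buffer intersection on the map would symbolize.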
Overall, geocoding is extremely useful and can be very quick and easy if your addresses are in a format that imports into ArcMap easily (Excel).  This makes it very useful for performing analyses on any type of location data, such as proximity or containment, or for simply displaying the locations of various places for which you have addresses.