Showing posts with label ArcGIS 10. Show all posts

Monday, September 10, 2012

ArcGIS 10.0 Implementation of the Canny Edge Detector

The following is my attempt at an almost pure ArcGIS implementation of the Canny Edge Detector. This would be much easier to do in Python/NumPy (or in one line using the canny function in scikit-image), but I wanted to be able to apply it to very large rasters that exceed the 2GB limit of the 32-bit NumPy that is installed with ArcGIS.

I found this site and this site both really helpful for actually implementing the Canny Edge Detector. Definitely refer to both of them for more examples and pictures.

I did use Python/NumPy for generating the Gaussian Filter kernel files that were needed for the ArcGIS Focal Statistics function. The following code will generate a properly formatted kernel file based on the kernel size and sigma value.
import os
import numpy as np
def gaussian_kernel(workspace, size=5, sigma=1.4):
    sigma_str = str(round(sigma, 1)).replace('.', 'p')
    file_name = 'gaussian_{0}_{1}.txt'.format(size, sigma_str)
    size = int(size)

    #### Build the kernel with the origin at the center cell
    i, j = np.mgrid[0:size, 0:size]
    i -= size // 2
    j -= size // 2
    g = np.exp(-(i**2 + j**2) / (2 * sigma**2))
    #### Normalize so the weights sum to 1
    g /= g.sum()

    #### First line of the kernel file is the kernel dimensions
    f = open(os.path.join(workspace, file_name), 'w')
    f.write('{0} {0}\n'.format(size))
    for row in g:
        f.write('{0}\n'.format(' '.join(['{0:0.6f}'.format(x) for x in row])))
    f.close()
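As a quick sanity check, the same kernel math can be run in memory before writing anything to disk (a minimal sketch; the gaussian_array helper below is just for illustration and is not part of the workflow above):

```python
import numpy as np

def gaussian_array(size=5, sigma=1.4):
    # Same math as gaussian_kernel, but returns the array
    # instead of writing a kernel file
    i, j = np.mgrid[0:size, 0:size].astype(float)
    i -= size // 2
    j -= size // 2
    g = np.exp(-(i**2 + j**2) / (2 * sigma**2))
    return g / g.sum()

g = gaussian_array(5, 1.4)
print(round(g.sum(), 6))   # normalized weights sum to 1
print(g[2, 2] == g.max())  # peak weight is at the center cell
```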
I then had to build kernel files for the Sobel Operators.
sobel_x.txt:
3 3
-1 0 1
-2 0 2
-1 0 1

sobel_y.txt:
3 3
-1 -2 -1
0 0 0
1 2 1
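These small kernel files could also be written with a short helper in the same "rows columns" header format (a sketch; the write_kernel name and the temp workspace are just for illustration):

```python
import os
import tempfile

def write_kernel(workspace, file_name, kernel):
    # Write a kernel file in the format expected by NbrWeight:
    # a "rows columns" header line followed by the weight rows
    path = os.path.join(workspace, file_name)
    with open(path, 'w') as f:
        f.write('{0} {1}\n'.format(len(kernel), len(kernel[0])))
        for row in kernel:
            f.write('{0}\n'.format(' '.join(str(v) for v in row)))
    return path

sobel_x = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
sobel_y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
workspace = tempfile.mkdtemp()  # stand-in for the real workspace
path = write_kernel(workspace, 'sobel_x.txt', sobel_x)
write_kernel(workspace, 'sobel_y.txt', sobel_y)
print(open(path).read().splitlines()[0])  # -> 3 3
```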
Finally, I had to build 8 kernel files for shifting the raster 1 cell in each of the primary directions for the non-maxima suppression calculation. I used this approach instead of the Data Management Shift function so that I would get Spatial Analyst raster objects, but I'm not sure it is actually better. The following are the kernel files for the first four directions.
d0.txt:
3 3
0 0 0
0 0 1
0 0 0

d1.txt:
3 3
0 0 0
0 0 0
0 0 1

d2.txt:
3 3
0 0 0
0 0 0
0 1 0

d3.txt:
3 3
0 0 0
0 0 0
1 0 0
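Rather than typing all eight files by hand, they can be generated with a short loop (a sketch; it assumes d4 through d7 mirror d0 through d3, since the code below pairs each direction with its opposite neighbor, and uses a temp directory as a stand-in workspace):

```python
import os
import tempfile

# (row, col) position of the single 1 in each kernel, following the
# pattern of the four files above; indices 4-7 are the mirror images
# of 0-3 so that dN and dN+4 are opposite neighbors
offsets = [(1, 2), (2, 2), (2, 1), (2, 0),
           (1, 0), (0, 0), (0, 1), (0, 2)]

workspace = tempfile.mkdtemp()  # stand-in for the real workspace
for n, (r, c) in enumerate(offsets):
    kernel = [['0'] * 3 for _ in range(3)]
    kernel[r][c] = '1'
    with open(os.path.join(workspace, 'd{0}.txt'.format(n)), 'w') as f:
        f.write('3 3\n')
        for row in kernel:
            f.write('{0}\n'.format(' '.join(row)))
```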
The following code is my implementation of the Canny Edge Detector in ArcGIS 10.0. There are probably more efficient ways of doing it, but this seemed to work. The intermediate rasters could be saved by adding a few temp_obj.save(temp_raster) calls. I didn't include all of my input checking, and some names/values are hardwired.
import os
import arcpy
from arcpy import env
import arcpy.sa as sa
def canny_edge(workspace, raster_name, filter_name, t_low, t_high):
    #### Build full paths to the input raster and gaussian kernel file
    raster_path = os.path.join(workspace, raster_name)
    filter_path = os.path.join(workspace, filter_name)

    #### Set ArcGIS environment parameters
    arcpy.CheckOutExtension("Spatial")
    env.overwriteOutput = True
    env.workspace = workspace
    env.scratchWorkspace = workspace
    env.cellSize = raster_path
    env.extent = raster_path
    env.snapRaster = raster_path

    #### Hardwired file names for kernels
    sobel_x_path = os.path.join(workspace, 'sobel_x.txt')
    sobel_y_path = os.path.join(workspace, 'sobel_y.txt')
    d0_path = os.path.join(workspace, 'd0.txt')
    d1_path = os.path.join(workspace, 'd1.txt')
    d2_path = os.path.join(workspace, 'd2.txt')
    d3_path = os.path.join(workspace, 'd3.txt')
    d4_path = os.path.join(workspace, 'd4.txt')
    d5_path = os.path.join(workspace, 'd5.txt')
    d6_path = os.path.join(workspace, 'd6.txt')
    d7_path = os.path.join(workspace, 'd7.txt')

    #### Canny Edge Detector Workflow
    #### Apply gaussian filter
    filter_obj = sa.FocalStatistics(
        raster_path, sa.NbrWeight(filter_path), "SUM", "NODATA")

    #### Calculate 1st derivative in horizontal and vertical direction
    grad_x_obj = sa.FocalStatistics(
        filter_obj, sa.NbrWeight(sobel_x_path), "SUM", "DATA")
    grad_y_obj = sa.FocalStatistics(
        filter_obj, sa.NbrWeight(sobel_y_path), "SUM", "DATA")
    del filter_obj

    #### Calculate gradient and direction
    grad_obj = sa.SquareRoot(
        sa.Power(grad_x_obj,2) + sa.Power(grad_y_obj,2))
    angle_obj = sa.ATan2(grad_y_obj, grad_x_obj)
    del grad_x_obj, grad_y_obj

    #### Reclassify angle into four directions
    #### See http://suraj.lums.edu.pk/~cs436a02/CannyImplementation.htm
    reclass_remap = sa.RemapRange(
        [[-3.141593, -2.748894, 0], [-2.748894, -1.963495, 1],
         [-1.963495, -1.178097, 2], [-1.178097, -0.392699, 3],
         [-0.392699,  0.392699, 0], [ 0.392699,  1.178097, 1],
         [ 1.178097,  1.963495, 2], [ 1.963495,  2.748894, 3],
         [ 2.748894,  3.141593, 0]])
    zone_obj = sa.Reclassify(
        angle_obj, "Value", reclass_remap, "NODATA")
    del angle_obj, reclass_remap

    #### Non-maxima suppression
    #### Shift raster 1 cell in each of the 8 possible directions
    #### This will get neighboring cells for non-maxima suppression
    #### Keep cells that are greater than the two neighboring cells
    ####  along the gradient direction
    #### Calculate separate raster for each primary direction then sum
    d0_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d0_path), "SUM", "DATA")
    d4_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d4_path), "SUM", "DATA")
    grad_04_obj = sa.Con(
        (zone_obj == 0) & (grad_obj > d0_obj) & (grad_obj > d4_obj),
        grad_obj, 0)
    del d0_obj, d4_obj

    d1_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d1_path), "SUM", "DATA")
    d5_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d5_path), "SUM", "DATA")
    grad_15_obj = sa.Con(
        (zone_obj == 1) & (grad_obj > d1_obj) & (grad_obj > d5_obj),
        grad_obj, 0)
    del d1_obj, d5_obj

    d2_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d2_path), "SUM", "DATA")
    d6_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d6_path), "SUM", "DATA")
    grad_26_obj = sa.Con(
        (zone_obj == 2) & (grad_obj > d2_obj) & (grad_obj > d6_obj),
        grad_obj, 0)
    del d2_obj, d6_obj

    d3_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d3_path), "SUM", "DATA")
    d7_obj = sa.FocalStatistics(
        grad_obj, sa.NbrWeight(d7_path), "SUM", "DATA")
    grad_37_obj = sa.Con(
        (zone_obj == 3) & (grad_obj > d3_obj) & (grad_obj > d7_obj),
        grad_obj, 0)
    del d3_obj, d7_obj
    del grad_obj, zone_obj

    print "Sum Directions"
    nms_obj = (grad_04_obj + grad_15_obj + grad_26_obj + grad_37_obj)
    del grad_04_obj, grad_15_obj, grad_26_obj, grad_37_obj
    nms_obj.save(raster_path.replace('.img', '_nms.img'))
    
    #### Hysteresis Thresholding
    t_low_obj = sa.RegionGroup(
        sa.Con(nms_obj > t_low, 1), "EIGHT", "CROSS", "NO_LINK")
    t_zs_obj = sa.ZonalStatistics(
        t_low_obj, "value", nms_obj, "MAXIMUM", "DATA")
    del t_low_obj
    t_high_obj = sa.Con((t_zs_obj >= t_high), nms_obj)
    del t_zs_obj, nms_obj
    t_high_obj.save(raster_path.replace('.img', '_edges.img'))
There are probably issues at the edges of the rasters, because I noticed that Focal Statistics was not properly handling NoData when using the kernel file weighting option.

Friday, August 19, 2011

NaN in NumPy array to NoData in ArcGIS raster

For smaller rasters, NumPy is a much faster alternative to some of the ArcGIS geoprocessing functions. One issue with doing this is that every raster must have exactly the same dimensions (rows, columns, cellsize, extent, etc.) since NumPy has no spatial component. Later on I will post about some of the techniques we use to make sure they are all identical.

There is one issue with converting NumPy arrays back to rasters, though, that I want to outline. The arcpy function NumPyArrayToRaster does not correctly convert NaN values. The values will display as NoData in the Identify window, but they do not act as NoData in subsequent calculations, and the 'Display NoData as' option on the Symbology tab has no effect.

To get around this, convert all NaN values to some unused value, and then set that value as the value_to_nodata parameter. For example:

#### pnt is an arcpy.Point of the raster's lower left corner
test_array[np.isnan(test_array)] = -9999
test_obj = arcpy.NumPyArrayToRaster(test_array, pnt, cellsize, cellsize, -9999)
test_obj.save(test_raster)

This is Bug NIM063028 and is not currently reported as fixed in any version.
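The replacement itself is plain NumPy and can be checked without touching arcpy (a minimal sketch; -9999 is assumed to be a value that never occurs in the real data):

```python
import numpy as np

nodata = -9999
test_array = np.array([[1.0, np.nan],
                       [np.nan, 4.0]])
# Same masking as above: every NaN becomes the NoData sentinel
test_array[np.isnan(test_array)] = nodata
print(test_array.tolist())  # -> [[1.0, -9999.0], [-9999.0, 4.0]]
```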

Pyramids documentation

There is a small issue with the documentation for the pyramids environment parameter. The explanation section for the pyramid_option has a few errors. The option is "PYRAMIDS", not "PYRAMID", and the default is to build full pyramids (-1 level), not NONE. This is Bug NIM062605 and is reported as fixed in 10.1. The example code at the bottom of the help page shows the correct usage. To force no pyramids I prefer to use: env.pyramid = "PYRAMIDS 0".

Also, the CreateRandomRaster function doesn't honor the pyramids setting. This is Bug NIM062606 and is not currently reported as fixed.

Wednesday, August 17, 2011

Offset of TIF rasters in ArcGIS 10, pre-SP2

In ArcGIS 10, pre-SP2, our Landsat TIF rasters were being shifted half a pixel away from where they should be. For older Landsat 5 TM imagery, the thermal band could be offset by up to 60m in both x and y. ESRI fixed the problem in SP2 (Bug NIM060759), but in case you are loading TIF rasters with a pre-SP2 version of ArcGIS 10, it is easy to shift the rasters back to the correct location. Since we were not using the panchromatic band from Landsat 7, all of our half cell sizes were integers (30m -> 15m, 60m -> 30m, 120m -> 60m), but that may be an issue for other applications.
#### Round each half cell size to the nearest integer,
#### then shift the raster back by half a cell
half_cell_width = int(0.5 * arcpy.Describe(landsat_tif).meanCellWidth + 0.5)
half_cell_height = int(0.5 * arcpy.Describe(landsat_tif).meanCellHeight + 0.5)
arcpy.Shift_management(landsat_tif, shift_raster, -half_cell_width, half_cell_height)
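The int(x + 0.5) rounding in the snippet above can be spot-checked against the three Landsat cell sizes without arcpy:

```python
# Half-cell shifts for the Landsat cell sizes mentioned above
for cell in (30, 60, 120):
    half = int(0.5 * cell + 0.5)
    print(cell, '->', half)  # 30 -> 15, 60 -> 30, 120 -> 60
```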


It is also fairly simple to check which ArcGIS version and service pack you have:
version_str = arcpy.GetInstallInfo("desktop")["SPBuild"]
#### version_str -> "10.0.2.3200"
sp_str = version_str.split(".")[2]
#### sp_str -> "2"

MakeFeatureLayer lock files

When calling the MakeFeatureLayer function in a Python script, a lock file is created that may persist even after the script finishes. To remove the lock file, just call arcpy.Delete_management(layer) before the script terminates.

This forum post provided the answer to this problem.