Classes - Beard


Reports / Documentation

Vision Recognition

vr_documentation.pdf

The quadrotor we are preparing for the challenge is required to operate in a GPS-denied area. Without GPS, only a few tools are available to the quadrotor to calculate its location. It will rely principally on a laser scanner, which simultaneously scans the surroundings and builds a map of them using a Simultaneous Localization and Mapping (SLAM) algorithm. A key step of this algorithm is known as loop closure. As the quadrotor gets further away from its initial location, the uncertainty of its current position increases. This uncertainty can be greatly reduced if the quadrotor recognizes a location it has visited before, which allows it to correct the entire map so that it becomes an accurate representation of the real environment.

Loop Closure

We achieve loop closure by regularly taking a sample picture from the quadrotor's video feed and comparing it to a database. Instead of containing images, the database contains thousands of features that were found in sample images. The amount of computing power required to build the database is large, which makes it impractical to build it from images captured on site. Instead, sample pictures of places with similar surroundings are taken beforehand.
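The comparison step can be sketched as follows, assuming each image has already been reduced to a vector of feature-cluster labels (its "bag of words"). This is a hypothetical Python stand-in, not the project's actual code; the function names are illustrative:

```python
from collections import Counter
import math

def bow_histogram(labels, num_clusters):
    """Normalized histogram of cluster labels for one image."""
    counts = Counter(labels)
    total = len(labels)
    return [counts.get(c, 0) / total for c in range(num_clusters)]

def histogram_similarity(h1, h2):
    """Cosine similarity between two bag-of-words histograms."""
    dot = sum(a * b for a, b in zip(h1, h2))
    n1 = math.sqrt(sum(a * a for a in h1))
    n2 = math.sqrt(sum(b * b for b in h2))
    return dot / (n1 * n2)

# Two views of the same place share most cluster labels, so their
# histograms score higher than a view of a different place.
query = bow_histogram([0, 1, 1, 2, 2, 2], num_clusters=4)
match = bow_histogram([0, 1, 2, 2, 2, 3], num_clusters=4)
other = bow_histogram([3, 3, 3, 3, 0, 0], num_clusters=4)
assert histogram_similarity(query, match) > histogram_similarity(query, other)
```

A loop closure is declared when the best similarity score against the database exceeds some threshold.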

SURF

The loop closure program works by first extracting features from an image. There are several different types of features that can be extracted; the method we used is called SURF, which extracts features that are scale- and rotation-invariant. These features are often abstract or very small, so don't expect to understand what features the computer is seeing. For information on how SURF works, we recommend http://www.aishack.in/2010/05/sift-scale-invariant-feature-transform/, which explains the SIFT algorithm that SURF is based on. Our code is written in OpenCV, which already has a SURF implementation.

The SURF function outputs two vectors. The first is called 'imagedescriptors'; it contains a 64-element descriptor for each feature found in the image (making it 64 by N). The other output, 'imagekeypoints', stores the location of each feature found in the image. So that the computer does not have to process every single feature (there will potentially be hundreds of thousands), the features in the images are grouped into clusters using k-means.

K-means

The features found in an image will not be evenly distributed. Instead, they will be clustered around objects in the image, as shown in Figure 2. The k-means algorithm finds the location of the center of each cluster on the image. Each feature is assigned to a k-means center based on its proximity, which is calculated using the Mahalanobis distance (eq. 1). We create a vector that contains the number of the center each feature is closest to (we'll call it 'label'). For example, say we have an image whose imagedescriptors vector is 140 features long. Each feature in this vector is then labeled in the second vector according to the center it is nearest to.
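A minimal k-means sketch in Python (a stand-in for the OpenCV/MATLAB code; squared Euclidean distance is used here for simplicity rather than the Mahalanobis distance of eq. 1, and 2-D points stand in for 64-element descriptors):

```python
def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iterations=20):
    """Lloyd's algorithm: return cluster centers and a label per point."""
    # Naive deterministic initialization for this sketch: first k points.
    centers = list(points[:k])
    labels = [0] * len(points)
    for _ in range(iterations):
        # Assignment step: label each feature with its nearest center.
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: dist2(p, centers[c]))
        # Update step: move each center to the mean of its members.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centers[c] = tuple(sum(v) / len(members) for v in zip(*members))
    return centers, labels

# Two well-separated blobs of "features" should receive two distinct labels.
pts = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
       (5.0, 5.1), (5.2, 5.0), (5.1, 5.2)]
centers, labels = kmeans(pts, k=2)
assert labels[:3] == [labels[0]] * 3
assert labels[3:] == [labels[3]] * 3
assert labels[0] != labels[3]
```

The resulting `labels` vector plays the role of the label vector described above: entry i gives the cluster center that feature i belongs to.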

bow.zip

Indoor Navigation

Cooperative UAV/UGV

Laser Scan Matching and Odometry

1st Design Review

2nd Design Review

Final Design Review

Journals

Occupancy Grid Mapping using Laser Range Finder - Paul Bartholomew and Everett Bryan

MATLAB files for building occupancy grids

build_world_map_from_cortex.m - top-level design that uses Cortex world positioning and laser range finder scans to determine the obstructions that lie in a horizontal plane.

updateoccupancygrid.m - function used with build_world_map_from_cortex.m that updates a matrix holding the values of a grid-based occupancy map.

Some PowerPoint slides showing what the above code can produce: laser_odometry_-_bryan_bartholomew.pptx

Occupancy grid figures created from the above code using the 'laser_log_2011-3-1-17-24-40-605' scan files from the Cortex room: 75mmoccupancygridfigures.zip
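The grid-update idea behind these files can be sketched as follows. This is a hypothetical Python stand-in, not the MATLAB code: a simple hit-counting scheme over 75 mm cells (the resolution suggested by the figure filename above) stands in for whatever update rule the MATLAB code actually uses:

```python
def world_to_cell(x, y, resolution=0.075, origin=(0.0, 0.0)):
    """Map a world coordinate (meters) to a grid cell index (75 mm cells)."""
    return (int((x - origin[0]) / resolution), int((y - origin[1]) / resolution))

def update_occupancy_grid(grid, endpoints, resolution=0.075):
    """Increment the hit count of every cell a laser return falls in.

    `grid` is a dict mapping cell index -> hit count; cells with high
    counts are treated as occupied.
    """
    for x, y in endpoints:
        cell = world_to_cell(x, y, resolution)
        grid[cell] = grid.get(cell, 0) + 1
    return grid

# Three returns from one scan; the first two land in the same 75 mm cell.
grid = update_occupancy_grid({}, [(0.10, 0.10), (0.11, 0.12), (0.50, 0.50)])
assert grid[(1, 1)] == 2
assert grid[(6, 6)] == 1
```

A threshold on the hit count then decides which cells are marked as obstructions in the horizontal plane.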

ICP - Daniel Perry

Weekly reports on progress:

February 18th: This last week I spent my time trying to figure out how the ICP algorithm works in MATLAB. I took some of the sample data we have from the laser and read it into MATLAB. I then plotted the points along with the next set of points and ran the ICP algorithm on them, but I was not able to get the output of the ICP algorithm to match up with the readings we have from Cortex. I spent more time reading about the algorithm online and found a different version that I want to try, which will hopefully work better than the one I have been using. It is newer and supposedly works in three dimensions. The reader comments on it are all positive, but I have not had time to implement it yet. If I can't get this one to work I am going to delve deeper into ICP algorithms to figure out how they work and, if necessary, try to write one of my own. I am hoping to avoid writing a new algorithm from scratch if at all possible.

February 28th: Accomplished last week: I worked on the ICP algorithm, and I think I finally got it to work. I wrote a MATLAB script that repeatedly reads in two sets of data points, runs them through an ICP algorithm, and outputs a rotation and a translation vector. This code will continually read in scans until we tell it to stop. Goals for next week: determine how to get the rotation and translation vectors into a form relative to the UAV, and test the robustness of the ICP algorithm by applying a known rotation and translation to see how well the code actually works.
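The pipeline described above, and the known-rotation robustness test planned for the next week, can be sketched in Python with NumPy. This is a stand-in for the MATLAB script, not its actual contents: basic point-to-point ICP with SVD-based (Kabsch) alignment:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, dst_c - R @ src_c

def icp(src, dst, iterations=30):
    """Iterate: match each src point to its nearest dst point, then re-align."""
    cur = src.copy()
    for _ in range(iterations):
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        matched = dst[d.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    # Composite transform taking the original scan onto its aligned position.
    return best_rigid_transform(src, cur)

# Robustness check: apply a known rotation/translation to a scan and recover it.
scan = np.random.default_rng(0).uniform(-2, 2, size=(40, 2))
theta = 0.05
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
moved = scan @ R_true.T + np.array([0.1, -0.05])
R_est, t_est = icp(scan, moved)
assert np.allclose(R_est, R_true, atol=1e-5)
assert np.allclose(t_est, [0.1, -0.05], atol=1e-5)
```

The recovered (R, t) per scan pair is the incremental odometry estimate; expressing it in the UAV's body frame is the coordinate-conversion problem discussed in the following entries.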

March 7th: This last week I was able to confirm that the ICP algorithm I have been working on works, but I am still having problems converting the translation and rotation matrices into the UAV's frame. I read the chapter on coordinate systems in Dr. Beard and McLain's book and looked online, but I still have not figured out how to map the data back to the robot correctly. I currently have MATLAB code that plots the first laser scan and then uses the successive scans to plot my projected location of the UAV. My projected location moves in circles, but it does not correctly correspond to the data from Cortex. I was hoping to find someone this week who has more experience with changing coordinate systems and get their help implementing one for our specific purpose.

March 14th: This week I concluded that the ICP algorithm does not work. I wrote up a document explaining why; it is attached: use_of_icp_for_uav_odeometry.docx Edit: I talked to Dr. Beard. He said that if I implemented a Kalman filter with the ICP algorithm it might still work. I went and talked to Raj about implementing a Kalman filter, and he explained how to do it; it seemed fairly easy. I talked to Brice to see if he wanted me to try, and he said he was already working on a different form of localization and would prefer I implement a height-finding algorithm. I did not end up trying to implement the Kalman filter with ICP; it might make ICP usable.

ICP MATLAB files: esample.m -example file provided with code

find_bound.m -support code

fminlbfgs.m -support code

icp.m -main icp file

icp_finite.m -a different ICP file I tried to implement; it does not work as well as icp.m

icpreader.m -support code

icptest.m -code I wrote to test whether the ICP algorithm worked

lsqnormest.m -support code

movepoints.m -support code

myicp.m -my implementation of the ICP algorithm

myicp2.m -more of my code implementing ICP

plotdatapoints.m -support code

quat2rmat.m -support code

rmat2quat.m -support code

rms_error.m -support code

Quad Trees - Sam Olson

Height Estimation - Daniel and Paul

heightestimation.m -top-level code that reads in Cortex data and laser range finder scans to determine the height the flying vehicle should maintain above the objects below.

getcurrentheight.m -function used with heightestimation.m that returns the current height from the mirror-reflected data returned by the laser range finder.

getheightaverage.m -function that averages a specified number of scans so that the aircraft doesn't make sudden jumps in altitude as it flies over obstacles.

We will also include our simulation file for flying up or down a flight of stairs: stairtest.m

The following zip file contains the figures produced from data extracted from the Cortex room file 'laser_log_2011-3-26-11-10-24-504' while 'flying' the aircraft over obstacles, as well as simulated figures for a staircase: heightestimationfigures.zip
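The scan-averaging idea behind getheightaverage.m can be sketched as a moving average over the last few height readings. This is a hypothetical Python stand-in for the MATLAB function, with an illustrative window size:

```python
from collections import deque

class HeightAverager:
    """Smooth height estimates over the last `window` scans."""

    def __init__(self, window=4):
        self.readings = deque(maxlen=window)

    def update(self, height):
        """Add one height reading (meters) and return the smoothed estimate."""
        self.readings.append(height)
        return sum(self.readings) / len(self.readings)

# A sudden spike in the raw reading (e.g. flying over a box) is damped,
# so the commanded altitude does not jump with it.
avg = HeightAverager(window=4)
for h in [1.0, 1.0, 1.0, 1.0]:
    avg.update(h)
smoothed = avg.update(2.0)   # raw reading jumps by a full meter
assert smoothed == (1.0 + 1.0 + 1.0 + 2.0) / 4
```

Larger windows give smoother flight but react more slowly to genuine terrain changes underneath the vehicle.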

Contour Mapping - Daniel, Paul, Sam

As a group, we started the semester by trying to create a MATLAB file that would create contour lines from a single laser scan by connecting each point to its closest neighbor. This worked to our basic understanding of the project, but the grad students who mentored us thought it didn't make any relevant contribution to the project. In case contour lines are used in the future, we have included our basic files here.

laserscan.m – Top level, instantiates a simple Cartesian laser scan

parentchild.m – finds the nearest node from all the given points

getdistance.m – a function to find the nearest point; it is used by parentchild.m

plotcontor.m – instead of using MATLAB's built-in plotter, we developed this to draw single contours from the parent-child array built in laserscan.m

Another useful file converts a single scan from the laser range finder into a Cartesian map that can be plotted by MATLAB: polarplot.m
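The polar-to-Cartesian conversion that polarplot.m performs can be sketched as follows (a hypothetical Python stand-in, assuming one range reading per evenly spaced bearing across the scanner's field of view):

```python
import math

def polar_to_cartesian(ranges, start_angle, angle_step):
    """Convert evenly spaced laser ranges (meters) to (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        theta = start_angle + i * angle_step
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Three readings at -90, 0, and +90 degrees, all 2 m away.
pts = polar_to_cartesian([2.0, 2.0, 2.0],
                         start_angle=-math.pi / 2,
                         angle_step=math.pi / 2)
assert all(abs(math.hypot(x, y) - 2.0) < 1e-9 for x, y in pts)
assert abs(pts[1][0] - 2.0) < 1e-9 and abs(pts[1][1]) < 1e-9
```

The resulting (x, y) list can then be plotted directly or fed into the occupancy-grid and contour code above.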

AFRL Grant Proposal - Michael Root, Paul Bartholomew, and John Macdonald