~ PROJECTS ~
Autonomous Coatings Removal System (ACRS)
Many ground and air vehicle systems require periodic coatings removal. Depending on the chemistry involved, this can be a time-consuming and even hazardous process for human workers. I have worked to ease this burden by helping the National Robotics Engineering Center remove coatings from aircraft autonomously as part of the ACRS project. Because the laser delivery system used for coatings removal must be positioned within tight tolerances, accurate placement of the end effector relative to the aircraft is critical. Accurate models of the aircraft are often unavailable, so we developed a system that scans the aircraft and autonomously builds a model that can be used for localization and planning. Once the underlying model is constructed, we use it to localize the delivery system with high accuracy relative to the area of the aircraft about to be processed.
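Aligning a fresh scan against the constructed model is at its core a rigid registration problem. As an illustration only (not the ACRS implementation), here is a minimal Kabsch-style alignment in Python, assuming point correspondences between scan and model are already known; real scan matchers such as ICP iterate this step while re-estimating correspondences:

```python
import numpy as np

def rigid_align(scan, model):
    """Find rotation R and translation t minimizing ||R @ p + t - q||
    over corresponding point pairs (p, q) -- the Kabsch algorithm.
    scan, model: (N, 3) arrays with row i of scan matching row i of model."""
    scan_c = scan - scan.mean(axis=0)     # center both clouds
    model_c = model - model.mean(axis=0)
    H = scan_c.T @ model_c                # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = model.mean(axis=0) - R @ scan.mean(axis=0)
    return R, t
```

With exact correspondences this recovers the transform in closed form; in practice the correspondences come from a nearest-neighbor search and the step is repeated until convergence.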
Underwater Mapping and Localization
Many structures in the ocean are subject to repeated waves and currents. Inspecting these structures to ensure they have not fractured is imperative, especially when the structures are oil rigs and platforms. Autonomous inspection of oil rigs can reduce cost and danger to humans while ensuring continued safe operation. During part of my time at Lockheed Martin, I helped design localization and mapping algorithms for oil rig inspection. We processed returns from a 2D sonar to infer larger structures, localizing against the minimal prior models of the rigs that were available. This technique enabled us not only to localize, but also to inspect for small defects, as-built differences, and other changes caused by ocean wash and debris.
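The scan-to-prior-map matching idea can be shown with a toy example. This sketch is my own illustration, not the Lockheed Martin code: it slides a 1-D sonar range profile along a prior map profile and picks the offset with the highest normalized correlation, the same principle that drives grid- and profile-based localization:

```python
import numpy as np

def locate_scan(scan, prior_map):
    """Slide a 1-D sonar profile along a prior map profile and return the
    offset with the highest normalized cross-correlation score."""
    n = len(scan)
    s = scan - scan.mean()
    best_offset, best_score = 0, -np.inf
    for off in range(len(prior_map) - n + 1):
        window = prior_map[off:off + n]
        w = window - window.mean()
        denom = np.linalg.norm(s) * np.linalg.norm(w)
        score = (s @ w) / denom if denom > 0 else 0.0
        if score > best_score:
            best_offset, best_score = off, score
    return best_offset, best_score
```

Once localized, the residual difference between the scan and the map window is what flags defects and as-built deviations.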
Video Aided Navigation
Computer vision combines cameras and computing to enhance understanding of the immediate surroundings. Recently I have had the opportunity to apply a visual odometry approach to both ground and air platforms carrying camera systems in order to provide this additional environmental understanding. We use a feature point tracker to track points in both color and IR imagery. From these point correspondences, we automatically fit motion models for the cameras that agree with the movements of the points in the image. These motion models yield the pose (position and orientation) of the camera system relative to the surrounding areas, and we track that pose over time. The odometry is then fed into navigation software that performs path planning, giving the robotic vehicle an understanding of its surroundings and its position in the world.
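To give a flavor of fitting a motion model to tracked points, here is a deliberately simplified sketch, again illustrative rather than the actual system: it fits a planar affine motion model to point correspondences by linear least squares, whereas the real pipeline recovers full 6-DoF camera pose from the same kind of correspondences:

```python
import numpy as np

def fit_affine_motion(pts_prev, pts_curr):
    """Least-squares fit of a 2x3 affine motion model M such that
    pts_curr ~= pts_prev @ M[:, :2].T + M[:, 2].
    pts_prev, pts_curr: (N, 2) arrays of tracked feature locations."""
    n = len(pts_prev)
    A = np.hstack([pts_prev, np.ones((n, 1))])   # homogeneous coords, (N, 3)
    M_T, *_ = np.linalg.lstsq(A, pts_curr, rcond=None)
    return M_T.T                                  # 2x3 motion model
```

In practice a robust estimator (e.g. RANSAC) wraps a fit like this so that mistracked features do not corrupt the motion estimate.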
3D from Video
One thing missing from the description above is any form of obstacle identification or avoidance. Using poses found in the manner described above, 3D models of a scene can be generated from a video camera alone. The work here focused on indoor applications but applies generally to anything you can point a camera at. We take video from a camera at arbitrary locations looking at a room, or at objects in it, and generate a 3D model of that room and/or those objects. The project is challenging because most existing systems require the user to designate corresponding points between images by hand; we automate this through feature tracking methods like those described above. It is also challenging because systems that rely on a calibrated fixed-baseline stereo rig to compute 3D point locations cannot be used; we have only a common digital video camera. Our approach is to track features while computing the relative motion of the camera across all the images. Once poses are known for a set of frames in the video, we use a variable temporal baseline approach to perform dense stereo reconstruction and acquire the 3D model of the scene.
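Once poses are known for two frames, each tracked feature can be triangulated back into 3D. Here is a minimal linear (DLT) triangulation sketch in Python, assuming known 3x4 projection matrices; the camera matrices and the test point below are made-up values for illustration. The variable temporal baseline idea is to pick frame pairs far enough apart in time that this triangulation is well-conditioned:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: (u, v) pixel coordinates.
    Each observation gives two linear constraints on the homogeneous point."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)       # null-space solution
    X = Vt[-1]
    return X[:3] / X[3]               # de-homogenize
```

Dense reconstruction simply repeats this for every pixel correspondence produced by stereo matching between the chosen frame pair.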
Augmented Reality
Augmented reality uses real-time computer graphics to present information to a user about an environment. For example, a user performing maintenance on a manufacturing machine may benefit from detailed schematics or procedural overlays that guide the maintenance operation. Another application is informative overlays for remote robotic operation.
Computed Tomography from Video Fluoroscopy
My master's thesis work was in the computed tomography (CT) field, where we developed a method of performing a CT scan using data from a fluoroscopy machine that had neither controlled motion nor external sensors to measure the motion. This is of particular impact to the medical field, since accurate 3D volumetric reconstruction in CT requires knowledge of the path along which the data were acquired. Until this work, that information could only be obtained using external sensors and/or precisely controlled motion.
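For intuition on why the acquisition path matters, here is a toy unfiltered back-projection in Python, written as an illustration (not the thesis method) and assuming the projection angles are known exactly; the thesis problem was precisely that this geometry had to be estimated rather than measured. Reconstruction smears each 1-D projection back across the image along its view direction, and the smears reinforce only where the object actually was:

```python
import numpy as np

def backproject(sinogram, angles, size):
    """Unfiltered back-projection onto a size x size grid.
    sinogram[i] is the 1-D parallel projection taken at angle angles[i]."""
    coords = np.arange(size) - size / 2.0
    xx, yy = np.meshgrid(coords, coords)          # pixel centers
    recon = np.zeros((size, size))
    for proj, theta in zip(sinogram, angles):
        # Detector coordinate of each pixel for this viewing angle
        s = xx * np.cos(theta) + yy * np.sin(theta)
        idx = np.clip(np.round(s + size / 2.0).astype(int), 0, size - 1)
        recon += proj[idx]                        # smear projection across grid
    return recon / len(angles)
```

If the angles fed in here are wrong, the smears no longer intersect at the true object locations, which is why the reconstruction degrades without accurate knowledge of the acquisition path.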
GK12 Outreach - Adventure Engineering
As part of an outreach program at the Colorado School of Mines, I have worked with 6th, 7th, and 8th graders in the Denver Public School System to give them a better understanding of what engineering is and how it applies to everyday life. One of our goals is to show students at a young age that it is important to perform well in their math and science classes, and to help them realize that applying math and science through engineering can be a fun and exciting career they may want to pursue.