Week 10

In the final week of the program, I put the finishing touches on my research poster and practiced my spoken presentation. The UNL College of Engineering hosted a practice research symposium on Tuesday, which helped me prepare for the real thing. Wednesday's official SRP symposium was great: I was able to see the work that the other students in the program had put together.

In the following days, I put together a rough draft of my research paper to thoroughly summarize my project. Unfortunately, we couldn’t get the Jetson computers to work, but our research still turned out pretty well. This summer has been a lot of fun, and I learned so much. Thanks to the Nimbus Lab and the SRP program for the great experience!

Week 9

At the beginning of our penultimate week, I frantically gathered my data and constructed my final poster. Monday and Tuesday were quite stressful, but I ended up with solid data and a poster submission. At our Tuesday meeting, Dr. Duncan and Dr. Detweiler looked through each of our posters and gave us some final advice, which was extraordinarily helpful. Today, I received confirmation that my poster was successfully printed, so I plan on picking that up this afternoon. This weekend, I will begin practicing my presentation for next week’s symposium.

For the second half of the week, Christina and I have been trying to resolve some of the issues we've run into while setting up the NVIDIA Jetson Nano. Getting the necessary packages installed (such as Keras, TensorFlow, NumPy, etc.) has been exceedingly difficult due to some unexpected errors. Beyond wrapping up our physical research, I've been organizing my data and code and have started working on my final paper. I plan to have the paper completed by next week.

Week 8

This week, my research went through some big changes. Pedro told me that using ResNet50 as the network architecture for my real-time drone detector would run about 11 FPS faster than TinyYOLO on the NVIDIA Jetson Nano, so I quickly began implementing the new architecture. We also decided to focus my research more on data augmentation and transfer learning than on regularization techniques. I submitted my poster title as well: “Real-Time Unmanned Drone Detection: Optimization Through Data Augmentation and Transfer Learning.”
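
To give a sense of the transfer-learning setup, here is a minimal Keras sketch that loads a pretrained ResNet50 backbone and attaches a small classification head. The input shape, class count, and head layers are assumptions for illustration; our actual detection pipeline differed.

```python
# Minimal transfer-learning sketch with a pretrained ResNet50 backbone.
# Input shape, class count, and head layers are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

base = ResNet50(weights="imagenet", include_top=False,
                input_shape=(224, 224, 3), pooling="avg")
base.trainable = False  # freeze the pretrained weights

model = models.Sequential([
    base,
    layers.Dense(256, activation="relu"),
    layers.Dense(2, activation="softmax"),  # e.g., drone vs. background
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```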

Over the weekend, we ran 16 different models on a computer with a GPU to take advantage of parallel processing for the massive amount of computation involved. I'm hoping this will yield some good data for my poster presentation. Over the next couple of days, I plan to deploy the best of these models on the Jetson and put it on a drone for real-world testing.

Beyond the poster presentation, I plan to spend the last week on active learning, which should substantially improve the model's performance mid-flight. With that, I expect my research paper will tie together nicely.

Week 7

This week, I began working with YOLO, a real-time object recognition system. The open-source implementation we were provided is a fairly general framework, so I decided to try to improve its generalization by implementing L2 regularization and early stopping.

Afterward, I wanted to expand the existing drone image dataset, which consisted of only 350 labeled training images. To make the dataset more robust and reduce the risk of overfitting, I wrote a Python script that automates the data augmentation process, fashionably named AAIL (“Automated Augmented Image Labeler”). The program takes a labeled dataset, applies various random transformations and rotations to the images, then works backward through each alteration to generate new labels for the altered images. With this, I was able to drastically expand our dataset without spending hours hand-labeling each image.
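
To illustrate the core idea (this is not the actual AAIL code; the normalized label format and the specific transforms are assumptions), here is a minimal NumPy sketch that applies a geometric transform to an image and derives the matching bounding-box labels:

```python
# Minimal sketch of AAIL's core idea: transform an image and recompute
# its bounding-box labels. Boxes are assumed to be normalized
# (class, x_center, y_center, width, height) tuples for illustration.
import random
import numpy as np

def flip_horizontal(image, boxes):
    """Mirror an H x W x C image; the x-centers of its boxes mirror too."""
    flipped = image[:, ::-1]
    new_boxes = [(c, 1.0 - xc, yc, w, h) for c, xc, yc, w, h in boxes]
    return flipped, new_boxes

def rotate_180(image, boxes):
    """Rotate 180 degrees; both center coordinates mirror."""
    rotated = image[::-1, ::-1]
    new_boxes = [(c, 1.0 - xc, 1.0 - yc, w, h) for c, xc, yc, w, h in boxes]
    return rotated, new_boxes

def augment(image, boxes):
    """Apply one randomly chosen transform; return the image and new labels."""
    transform = random.choice([flip_horizontal, rotate_180])
    return transform(image, boxes)
```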

At this point, I decided to start gathering data. With this new, much larger dataset, training our network took about 12 hours. Unfortunately, the network didn't perform as well as I anticipated: it had a very low average precision on the testing data. Pedro, one of my graduate mentors, suggested a method called “transfer learning,” which would make the training process much quicker and could yield better results. I plan to implement this technique in the next couple of days.

By the end of the week, I compiled the progress I've made into a rough draft of my research poster, which I'll keep editing as my research continues.

Week 6

This week, Christina and I implemented a convolutional neural network in TensorFlow and Keras to recognize the sign language alphabet. After four epochs, our accuracy reached ~91% on the testing data. With a more extensive and diverse dataset, additional regularization, and longer training, this number could likely reach the upper nineties.
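
For a sense of what this looks like in code, here is a minimal Keras CNN sketch. The input shape (28x28 grayscale) and the 26-class output are assumptions for illustration; our actual architecture and dataset differed in the details.

```python
# Minimal CNN sketch in TensorFlow/Keras for an alphabet classifier.
# Input shape and class count are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(32, (3, 3), activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D((2, 2)),
    layers.Conv2D(64, (3, 3), activation="relu"),
    layers.MaxPooling2D((2, 2)),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(26, activation="softmax"),  # one class per letter
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(x_train, y_train, epochs=4, validation_data=(x_test, y_test))
```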

Afterward, we moved on to real-time object detection. Ji Young helped us get started with YOLO (“You Only Look Once”), an object recognition system that operates in real time and, in many ways, outperforms competing algorithms (including the historically popular R-CNN). With this algorithm in mind, we began working on our final project, which centers on real-time drone-to-drone detection. After Christina found a dataset of hundreds of labeled drone images, I wrote a short Python script that extracts and reformats the important data from each image/label pair and writes it to a file that we feed into YOLO. With that done, I began training the network on the dataset.
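
As a rough illustration of that conversion step (the pixel-space corner-coordinate input format is an assumption; the real dataset may have used something different), here is a minimal sketch that produces YOLO-style normalized labels:

```python
# Minimal sketch of converting pixel-space corner boxes into YOLO's
# normalized (class x_center y_center width height) label format.
# The (xmin, ymin, xmax, ymax) input format is an illustrative assumption.
def to_yolo(xmin, ymin, xmax, ymax, img_w, img_h, cls=0):
    x_center = (xmin + xmax) / 2.0 / img_w
    y_center = (ymin + ymax) / 2.0 / img_h
    width = (xmax - xmin) / img_w
    height = (ymax - ymin) / img_h
    return f"{cls} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"

# Example: a 100x50-pixel box with its top-left corner at (200, 150)
# in a 640x480 image.
print(to_yolo(200, 150, 300, 200, 640, 480))
```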

Looking forward, I plan to implement some regularization techniques in the YOLO network we're using. Since a large focus of my literature review is the optimization of deep networks to improve their ability to generalize to new data, I believe it would be beneficial to incorporate this into our final project.

Week 5

This week, Christina and I finished the third TensorFlow Jupyter notebook assignment, which covered regularization techniques in deep neural networks. We added L2 regularization and early stopping to the MNIST digit classifier, resulting in a trained model that overfits less and generalizes more effectively to new data.
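
In Keras, both techniques amount to only a few lines. Here is a minimal sketch on MNIST; the layer sizes, regularization strength, and patience value are assumptions for illustration, not our exact notebook code.

```python
# Minimal sketch of L2 regularization and early stopping on MNIST.
# Layer sizes, the L2 coefficient, and patience are illustrative choices.
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers, callbacks

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

model = models.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(128, activation="relu",
                 kernel_regularizer=regularizers.l2(1e-4)),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Halt training once validation loss stops improving.
early_stop = callbacks.EarlyStopping(monitor="val_loss", patience=3,
                                     restore_best_weights=True)
model.fit(x_train, y_train, epochs=50,
          validation_split=0.1, callbacks=[early_stop])
print(model.evaluate(x_test, y_test))
```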

After receiving several helpful comments on the first draft of my literature review from our faculty and graduate mentors, I completed a second draft. Most of the problems with my submission came down to a lack of detail and poor flow between sections. To address this, I made sure that the “Network Optimizations” section was completely independent of the succeeding “Challenges” section, and within “Challenges” I added a new subsection on the Vanishing Gradient Problem. With these changes, among others, I was able to add more detail to each section, and the paper began to flow better.

On Tuesday, Ji Young gave a short presentation on convolutional neural networks and provided us with NVIDIA Jetson Nano development kits, which we will be working with in the coming weeks. With these small but powerful computers, we plan to deploy a complex classifier on a drone.

On Wednesday, we attended a workshop which covered graduate school applications and effective ways to write a personal statement. I found this to be very helpful since I’m beginning to look at graduate programs.

Week 4

This week, I was primarily focused on writing the first draft of the literature review. A large portion of my time was spent reading and annotating papers relevant to our research. Without much guidance, a lot of the writing process consisted of figuring out what we were expected to put together for submission. Finding commonalities between the papers was a bit of a challenge, but network optimizations and common challenges emerged as two prominent themes across the literature. Overall, I found the process of writing a professional literature review very rewarding, and I hope to get more experience with it in the future.

At the end of the week, we finished most of the third Jupyter notebook assignment in Google Colab. This assignment had us incorporate regularizers and “early stopping” to produce a model that generalizes well to testing data without overfitting the training data. TensorFlow is still a bit difficult to understand, especially since we spent much of the week on the literature review, but it's coming along. We have been working with the MNIST database of handwritten digits, but next week we plan to move on to the CIFAR database, which is a larger dataset with more classes.

On Wednesday, we participated in a mock symposium workshop, which helped us understand the expectations for the poster competition. We also became familiar with the judging criteria, effective ways to communicate research, and how to put together a visually pleasing poster.

Week 3: A Different Kind of Networking…

… and I’m not talking about the time when my freshman year residential advisor tried to add me on LinkedIn. Instead, I learned about neural networks and professional networking.

At the start of the week, Christina and I were introduced to TensorFlow, Google's open-source library for machine learning. After learning some of the fundamentals of neural networks (including backpropagation, gradient descent, softmax, ReLU, and so on), we began working through a couple of Jupyter Notebook assignments to get familiar with TensorFlow. We quickly discovered that much of TensorFlow's power is hidden under the hood, a degree of control that neither Christina nor I wanted to give up. Having dealt directly with the math, learning process, and logic of the perceptron classifier, we wanted to understand every detail of the heavy lifting TensorFlow was doing behind the scenes. We eventually realized that this desire was a bit unreasonable in the context of our research, so we began to trust the power of TensorFlow. With time, we completed both assignments and reused the processed webcam feed from our previous assignment as fresh testing data.

Afterward, we began reading some research papers in preparation for our literature review. The first paper I read was titled “Deep Learning,” which led me to read a cited work titled “ImageNet Classification with Deep Convolutional Neural Networks,” in addition to some blog posts by Adit Deshpande. Each of these works aided my understanding of convolutional neural networks and their applications.

Our workshop this week was on professional networking, which taught us how to find connections within an industry and how to exhibit professionalism.

Week 2

At the start of the week, my lab partner, Christina, and I finished our program for creating a decision boundary for a linearly separable dataset, using Matplotlib to graph the solution. Afterward, we began working on a 10-perceptron classifier (with each perceptron representing a single digit from 0 through 9) for classifying handwritten digits. We trained our perceptrons for 10 generations on the MNIST database, which contains 42,000 unique images. After training, our error rate reached ~18.25%, which is about what we were expecting from a perceptron-based learning algorithm.
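
For reference, the perceptron update rule is only a few lines of NumPy. This sketch trains a single perceptron on toy linearly separable data; for digit classification we ran ten of these, one per digit (the toy data here is an illustrative assumption, not MNIST).

```python
# Minimal sketch of the perceptron learning rule (NumPy only). One
# perceptron learns a linear decision boundary; for digits, ten such
# perceptrons run one-vs-rest. The toy data is an illustrative assumption.
import numpy as np

def train_perceptron(X, y, epochs=10, lr=0.1):
    """X: (n_samples, n_features); y: labels in {0, 1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            # Nudge the boundary only when the prediction is wrong.
            w += lr * (yi - pred) * xi
            b += lr * (yi - pred)
    return w, b

# Toy linearly separable data: class 1 when x0 + x1 > 1.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0],
              [1.0, 1.0], [0.9, 0.8], [0.1, 0.2]])
y = np.array([0, 0, 0, 1, 1, 0])
w, b = train_perceptron(X, y)
print("weights:", w, "bias:", b)
```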

Our next challenge was to use an external USB camera with ROS, having our Python script subscribe to the camera's publisher stream. Once this was working, we could point the camera at a handwritten digit, and our algorithm was, for the most part, able to classify it correctly. Some numbers were misclassified because they look similar (such as 5 and 6), and others because of variation in the dataset (such as 4 and 1, both of which can be written in various ways).
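
A minimal rospy subscriber for this kind of setup looks roughly like the sketch below. The topic name "/usb_cam/image_raw" and the classify() hook are assumptions for illustration; our actual node differed in the details.

```python
# Minimal rospy sketch: subscribe to a USB camera's image topic and
# hand each frame to a classifier. The topic name and classify() hook
# are illustrative assumptions.
import rospy
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def image_callback(msg):
    # Convert the ROS Image message into an OpenCV/NumPy array.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="mono8")
    # classify(frame)  # run the perceptron classifier on the frame

rospy.init_node("digit_classifier")
rospy.Subscriber("/usb_cam/image_raw", Image, image_callback)
rospy.spin()  # keep the node alive, processing callbacks
```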

At the end of the week, the graduate students introduced us to more complex neural networks. We plan to work more with Jupyter and TensorFlow next week.

Aside from our lab work, we attended a scientific writing workshop on Wednesday where we learned about the importance of scientific writing and how to effectively present and communicate our research.

Week 1

My name is Chris, and I'll be a third-year computer science major at the University of Minnesota – Twin Cities this fall. I am originally from Lincoln, NE, and I try to visit my family there as much as possible. At the U of M, I work as a TA for the introductory computer science course. I've found it to be very rewarding, and I plan on TAing a different class this coming semester. I'm also pursuing a music minor with a primary focus on jazz performance.

In the REU program, I am a member of the Unmanned Systems lab. During orientation, we attended several presentations, each covering an important aspect of grad school and research in general (including financial literacy, authorship, mentorship, etc.). Over the past week, we have been introduced to the Nimbus Lab space, the faculty, the grad students, and more. To start, we worked through an extensive ROS (Robot Operating System) tutorial, learned how to fly Hubsan drones, studied for the Part 107 drone certification, and became more familiar with the people and environment around us.

At the end of the week, we were each grouped with a faculty member and graduate students to get started on our individual research projects. I will be working with Dr. Detweiler and two graduate students, Pedro Albuquerque and Ji Young Lee, on a project involving machine learning and neural networks. Last Friday, my lab partner and I learned about single-layer perceptrons, which define a decision boundary (a line) that splits a linearly separable dataset. I'm very excited to learn more about this subject, since my primary area of interest is artificial intelligence.

By the end of the program, I hope to garner the necessary skills to do great research in graduate school and in my future career.