My name is Andre Carrera, and I'm the founder of Lambdo.
For the next four months (February - May 2020) I'll be part of the 3rd cohort of the OpenAI Scholars program. I came across this program by coincidence last year. I took Dr. Wingate's deep learning class during my last semester at BYU, and as I was exploring opportunities in the field, I ran into a blog post for the program. I quickly applied, got through some interviews, and here I am.
Every two weeks I'll write about my experience. I'll try to be as detailed as possible, and hopefully these blog posts will be useful for anyone interested in getting into the field.
I really believe self-education is a legitimate option for anyone interested in deep learning, or in development in general. Shortly after starting my university experience in CS, I realized that I wasn't going to be taught skills that would be applicable in the real world. It might just have been my university, but I felt there was too much theory, too much busy work, and not enough application. I've only found a few classes useful so far, and I could fit all of those into a single semester.
So I took matters into my own hands and challenged myself to learn those skills by taking on software development projects. I took on projects without knowing how to use the technology, but fully confident that I would be able to learn. Lucky for me, that turned out to be the case. I found that having real projects with real goals and real outcomes was extremely helpful for my education. Not only was it good motivation to learn, but I was also picking up real, marketable skills.
I see the OpenAI Scholars program as another opportunity to learn valuable skills. I'm lucky enough to create my own curriculum and work on my own open source project, and I'm especially excited about the opportunity to rub shoulders with experts in the field. As part of the program I created a syllabus, attached below, that I'll keep updating throughout the program.
I decided to use the first two weeks to explore: find interesting papers, learn some of the foundations, and get a feel for how fast to approach things. I started out by going through Ian Goodfellow's Deep Learning Book and reading some papers on my syllabus. After a day of that, I realized I couldn't passively take in so much information. I'm used to engineering, where there is an objective and any reading and learning is quickly applied. So I decided to change my strategy a bit: I'm going to spread out the hard technical reading and focus more on application and implementation.
I spent the week building an understanding of self-attention and transformers, since the next few weeks and my final project will depend on that understanding. For anyone interested in those topics, The Illustrated Transformer was extremely valuable.
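To make the core idea concrete, here is a minimal NumPy sketch of single-head scaled dot-product self-attention, the building block of the transformer. The shapes and variable names are my own illustration, not taken from The Illustrated Transformer: each token's query is compared against every token's key, the scaled scores are softmaxed into attention weights, and those weights mix the value vectors.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X:          (seq_len, d_model) input embeddings
    Wq, Wk, Wv: (d_model, d_k) learned projection matrices
    Returns:    (seq_len, d_k) attended representations
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Compare every query with every key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # (seq_len, seq_len)
    # Row-wise softmax turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted mix of all value vectors
    return weights @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))        # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                       # (4, 8)
```

A real transformer runs several of these heads in parallel and adds masking, but this is the whole mechanism in miniature.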
I also decided that throughout the program I'm going to contribute to Swift for TensorFlow. In fact, this week I started working on a pull request to implement an example transformer for translation, based on The Annotated Transformer. I'll post links once it's done.
Next week I plan to work on conversational AI with transfer learning. I'm interested to see what kind of results I can get with transformers and what other baselines I can implement.
If you have any questions, please feel free to reach out to me!