First of all, feel free to join in on our weekly meetings! Mail us for the zoom links.
- General & talks: Friday 11:45am – 1pm
- VR meetings: Friday 11am – 11:30am
- Language meetings: Thursday 12pm – 1pm
Second, read some of the following papers.
What do you like about them? What would you be most interested in focusing on? Perhaps you don’t like that goals can only be defined in a single way. Perhaps you want to think more about how we can represent the world, and add a vision module to it. Let us know exactly what you want to invest your time in!
Third, do the tutorials [disclaimer: you don’t have to do all of them if you want to focus on one you really like right away, but if you are unsure we would suggest trying one of each type~]:
- Getting started with Magic Leap and Unity: https://creator.magicleap.com/learn/guides/unity-setup-intro
- ROS Overview: http://wiki.ros.org/ROS/Introduction#:~:targetText=ROS%20is%20an%20open%2Dsource,between%20processes%2C%20and%20package%20management.
- Create a ROS package: http://wiki.ros.org/ROS/Tutorials/CreatingPackage
- Simple Publisher/Subscriber: http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28python%29
- simple_rl is a framework for reinforcement learning and other types of sequential decision-making agents. This is where you can play around with simulations of little agents in a gridworld. You will need to understand (and maybe implement) the Value Iteration algorithm. You should also read Chapter 17 of Russell & Norvig (the AI textbook).
- PyTorch: PyTorch is a deep learning framework like TensorFlow. Think of it as a Python package for building neural networks. Let us know if you need help installing anything (venv, etc.).
- ZeroToAll: (Deniz’s favourite) After you watch these videos and go through the tutorials, implement the I-DRAGGN model from this paper [http://h2r1.cs.brown.edu/wp-content/uploads/karamcheti17.pdf]
- Seq2Seq: You could also learn how to implement a seq2seq model, and then implement the model from this paper [https://arxiv.org/pdf/1905.12096.pdf]
- PiDrone: PiDrone is an autonomous Raspberry Pi/Python drone developed by the lab. Each year (traditionally in the fall semester), Stefanie runs a class (CS1951R) where every student builds their own drone and learns about the underlying autonomy. You can read papers about the drone here:
- Software stack (see README.md): https://github.com/h2r/pidrone_pkg
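To make the Value Iteration algorithm from the simple_rl tutorial concrete, here is a minimal sketch on a toy five-state gridworld. It is written in plain Python with illustrative names and numbers of our own choosing; simple_rl’s actual API differs, so treat this as a study aid, not a template:

```python
# Minimal Value Iteration on a toy 1-D gridworld (illustrative, not simple_rl's API).
# States 0..4; reaching state 4 gives reward +1 and the episode ends.

GAMMA = 0.95   # discount factor
THETA = 1e-6   # convergence threshold

n_states = 5
actions = [-1, +1]  # move left, move right

def step(s, a):
    """Deterministic transition: returns (next_state, reward)."""
    if s == n_states - 1:              # terminal state absorbs
        return s, 0.0
    s2 = min(max(s + a, 0), n_states - 1)
    reward = 1.0 if s2 == n_states - 1 else 0.0
    return s2, reward

# Sweep over states, applying the Bellman optimality backup until values stop changing.
V = [0.0] * n_states
while True:
    delta = 0.0
    for s in range(n_states):
        best = max(r + GAMMA * V[s2] for s2, r in (step(s, a) for a in actions))
        delta = max(delta, abs(best - V[s]))
        V[s] = best
    if delta < THETA:
        break

# Extract the greedy policy: here it always moves right, toward the goal.
policy = [max(actions, key=lambda a: step(s, a)[1] + GAMMA * V[step(s, a)[0]])
          for s in range(n_states)]
```

The key idea to take into Chapter 17 of Russell & Norvig: the value of a state is the best one-step reward plus the discounted value of wherever that step lands, and repeating that backup converges to the optimal values.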
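As a taste of what the PyTorch tutorials build up to, here is a minimal, self-contained sketch of training a tiny network on toy data. Everything here (the data, the layer sizes, the hyperparameters) is illustrative, not from any particular tutorial:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy regression data: learn y = 2x + 1 from noisy samples.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * x + 1 + 0.05 * torch.randn_like(x)

# A tiny feed-forward network: 1 input -> 16 hidden units -> 1 output.
model = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Standard training loop: forward pass, loss, backward pass, parameter update.
for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
```

After training, `loss` should be small (roughly the noise level of the data). The same five-line loop structure reappears in almost every PyTorch project, just with bigger models and real datasets.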
If you need any help with them, we encourage you to drop by the lab (knock on the door!), introduce yourself, and work from the lab. Most people who are around are friendly, willing, and able to help! You can also reach out to the mailing list h2r at cs.brown.edu, or to:
- firstname.lastname@example.org for the AR/VR, ROS, and simple_rl tutorials
- email@example.com for the PyTorch and simple_rl tutorials
- firstname.lastname@example.org for the PiDrone, ROS, and simple_rl tutorials
Did these feel like interesting problems to solve? Did you like the coding side of it? Do you feel like you could make use of these frameworks in your own dream project?
If you do these and enjoy them, we can think about the next step and help you join the lab officially. If you decide you would like to join a certain project, or perhaps start your own research project, please talk to Stefanie (our PI!), who can be reached at email@example.com. If you can’t reach her, you can also contact the members mentioned above. Our goal is to get you paired up with a student who is already in the lab and attending one of our weekly project-specific meetings. You will start out by helping with their project, and then move on to your own project.
~ Extra Resources ~:
- Deep Learning (more mathy): https://www.deeplearningbook.org (you can also get a Kindle version on Amazon, but this link is free!)
- Understanding LSTM Networks (recurrent networks) https://colah.github.io/posts/2015-08-Understanding-LSTMs/
- Word Embeddings Explained: 1, 2, 3, 4.
- Seq2seq attention visualization (the PyTorch tutorial’s diagram is not as clear): http://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
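If the LSTM post clicks, a quick sanity check on your understanding of the tensor shapes is to run a sequence batch through PyTorch’s `nn.LSTM`. This is a small illustrative sketch (the sizes are arbitrary), not taken from any of the linked posts:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# One-layer LSTM: 8 input features per timestep, 16 hidden units.
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

# A batch of 4 sequences, each 10 timesteps long, 8 features per step.
x = torch.randn(4, 10, 8)

output, (h_n, c_n) = lstm(x)
# output: hidden state at every timestep -> (batch, seq_len, hidden) = (4, 10, 16)
# h_n:    final hidden state            -> (num_layers, batch, hidden) = (1, 4, 16)
# c_n:    final cell state, same shape as h_n
```

Mapping these three outputs back to the diagrams in the LSTM post (which arrow is `h_t`, which is `c_t`) is a good exercise before moving on to seq2seq models, where `h_n`/`c_n` become the decoder’s initial state.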