* Vision - A custom CNN using YOLO [3]; we process 256x256 images (inputs are downscaled to this size) at 10 fps to detect bounding boxes for balls and goal posts
* Localization - A Kalman filter (currently used mainly for tracking rotation)
* Networking - Game controller (referee) [4], team communication [5] and a debug interface [6]
* Behaviour - A hybrid state machine
* Walking - Inverse kinematic walk with a balance system [7]
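Since a few people usually ask what "Kalman filter for tracking rotation" looks like in practice, here's a minimal sketch of the idea (not our actual code, and the noise constants are made up): a 1-D filter where yaw is predicted from the gyro rate and corrected by an absolute heading fix, e.g. from vision.

```python
class YawKalman:
    """Minimal 1-D Kalman filter for yaw tracking (illustrative sketch only).

    State: yaw angle in radians. Prediction integrates the gyro rate;
    the update blends in an absolute heading measurement.
    """

    def __init__(self, q=0.01, r=0.1):
        self.x = 0.0  # yaw estimate (rad)
        self.p = 1.0  # estimate variance
        self.q = q    # process noise per step (gyro drift); value is illustrative
        self.r = r    # measurement noise; value is illustrative

    def predict(self, gyro_rate, dt):
        # Integrate the gyro; uncertainty grows each step
        self.x += gyro_rate * dt
        self.p += self.q

    def update(self, measured_yaw):
        # Standard Kalman update: gain weighs measurement vs. prediction
        k = self.p / (self.p + self.r)
        self.x += k * (measured_yaw - self.x)
        self.p *= (1.0 - k)
        return self.x


kf = YawKalman()
for _ in range(100):
    kf.predict(gyro_rate=0.0, dt=0.1)  # robot standing still
    estimate = kf.update(measured_yaw=1.0)  # noisy-free fix at 1 rad for demo
```

With repeated consistent measurements the estimate converges to the measured heading while the variance settles to a small steady-state value.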
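By "hybrid state machine" we mean discrete behaviour states whose transitions are guarded by continuous inputs. A toy sketch of that structure (the states and thresholds here are invented for illustration, not our real behaviour tree):

```python
from enum import Enum, auto


class State(Enum):
    SEARCH = auto()
    APPROACH = auto()
    KICK = auto()


class BehaviourHSM:
    """Toy hybrid state machine: discrete states, transitions guarded by
    continuous sensor values (ball visibility and distance)."""

    def __init__(self):
        self.state = State.SEARCH

    def step(self, ball_visible, ball_distance):
        # Guard conditions mix boolean and continuous inputs;
        # the 0.2 m kick threshold is purely illustrative.
        if self.state is State.SEARCH and ball_visible:
            self.state = State.APPROACH
        elif self.state is State.APPROACH:
            if not ball_visible:
                self.state = State.SEARCH
            elif ball_distance < 0.2:
                self.state = State.KICK
        elif self.state is State.KICK and ball_distance >= 0.2:
            self.state = State.APPROACH
        return self.state


hsm = BehaviourHSM()
hsm.step(ball_visible=True, ball_distance=1.0)   # SEARCH -> APPROACH
hsm.step(ball_visible=True, ball_distance=0.1)   # APPROACH -> KICK
```

The advantage over a pure finite state machine is that the continuous quantities (distances, angles) live alongside the discrete states instead of being pre-quantized into events.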
Feel free to ask questions. We plan to open source everything (yes, everything) within one to two months.
[3] https://pjreddie.com/darknet/yolo/
[4] https://github.com/RoboCup-Humanoid-TC/GameController
[5] https://github.com/RoboCup-Humanoid-TC/mitecom
[6] https://github.com/hellerf/EmbeddableWebServer
[7] https://github.com/Rhoban/IKWalk
EDIT: Bullet points on different lines