Introduction to robotics (for Cognitive Science)
1-hour lecture + 2-hour computer exercises, year 2022
1. Action. Actuators. Controllers. Robots.
Exercise: iCubSim touching ball on table
video 1
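Illustrative sketch (not the iCubSim API): a minimal proportional controller in Python that drives one simulated joint toward a target angle; the gain kp and the time step dt are arbitrary assumptions.

    # A minimal proportional-controller sketch (not iCubSim-specific):
    # the controller drives a single joint angle toward a target position.
    def p_controller(target, current, kp=0.5):
        """Return a velocity command proportional to the position error."""
        return kp * (target - current)

    # Simulate a joint starting at 0 rad moving toward 1.0 rad.
    angle, target, dt = 0.0, 1.0, 0.1
    for step in range(50):
        velocity = p_controller(target, angle)
        angle += velocity * dt          # simple Euler integration of the command
    print(f"final joint angle: {angle:.3f} rad")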
2. Kinematics: Direct and inverse.
Exercise: iCubSim kinematics
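Illustrative sketch: direct and inverse kinematics of a toy 2-link planar arm in Python/numpy; the link lengths l1 and l2 are arbitrary assumptions, not iCub parameters.

    # Direct and inverse kinematics of a 2-link planar arm.
    import numpy as np

    def forward(q1, q2, l1=1.0, l2=0.8):
        """Direct kinematics: joint angles (rad) -> end-effector position."""
        x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
        y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
        return x, y

    def inverse(x, y, l1=1.0, l2=0.8):
        """Inverse kinematics (one of the two solutions) for a reachable point."""
        c2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
        q2 = np.arccos(np.clip(c2, -1.0, 1.0))
        q1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(q2), l1 + l2 * np.cos(q2))
        return q1, q2

    x, y = forward(0.3, 0.6)
    print(inverse(x, y))   # recovers approximately (0.3, 0.6)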
3. Perception. Sensors. The basic processing of sensor data: distance, camera image, depth map.
Exercise: iCubSim seeing ball on table via color filter
video 3
numpy library exercise
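Illustrative sketch for the color-filter exercise: segmenting a colored ball with OpenCV and numpy and reporting its centroid; the image file name and the HSV bounds are assumptions that need tuning for the actual object and lighting.

    # Minimal color-filter sketch: segment a colored ball in a camera image
    # and report its centroid.
    import cv2
    import numpy as np

    frame = cv2.imread("camera_frame.png")          # assumed sample image
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)    # hue-saturation-value space

    lower = np.array([40, 80, 80])                  # lower HSV bound (assumed)
    upper = np.array([80, 255, 255])                # upper HSV bound (assumed)
    mask = cv2.inRange(hsv, lower, upper)           # binary mask of "ball" pixels

    ys, xs = np.nonzero(mask)
    if xs.size > 0:
        print("ball centroid (x, y):", xs.mean(), ys.mean())
    else:
        print("no ball-colored pixels found")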
4. Control. Decomposition of the control system. Blackboard architecture.
Exercise: iCubSim grasping and dropping an object, speaking and moving its lips.
video 4
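Illustrative sketch of a blackboard architecture: independent modules read from and write to a shared data store, and a simple loop plays the role of the control shell; the module names and keys are invented for the example.

    # A minimal blackboard-architecture sketch.
    blackboard = {"ball_position": None, "command": None}

    def vision_module(bb):
        # In a real system this would come from the camera pipeline.
        bb["ball_position"] = (120, 80)

    def decision_module(bb):
        if bb["ball_position"] is not None and bb["command"] is None:
            bb["command"] = "reach_for_ball"

    def motor_module(bb):
        if bb["command"] == "reach_for_ball":
            print("moving arm toward", bb["ball_position"])
            bb["command"] = None

    for module in (vision_module, decision_module, motor_module):
        module(blackboard)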
5. Computer vision without machine learning. Recognition of regular objects.
Hough transform.
Exercise: Jetbot following a ping-pong ball
video 5
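Illustrative sketch for the ball-following exercise: circle detection with OpenCV's Hough transform; the image file name and all detector parameters are assumptions that need tuning for the Jetbot camera.

    # Circle detection with the Hough transform.
    import cv2
    import numpy as np

    frame = cv2.imread("jetbot_frame.png")            # assumed sample image
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)                    # smooth to reduce false circles

    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1, minDist=30,
                               param1=100, param2=30, minRadius=5, maxRadius=100)
    if circles is not None:
        for x, y, r in np.round(circles[0]).astype(int):
            print(f"circle at ({x}, {y}) with radius {r}")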
6. Phase correlation. Feature detectors (SIFT/SURF/ORB). Trackers (MIL).
Exercise: Robot following a selected object via the MIL or KCF tracker
video 6
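Illustrative sketch: tracking a user-selected object with OpenCV's MIL tracker (KCF is used the same way); depending on the OpenCV version the constructor may be cv2.TrackerMIL_create() or live in cv2.legacy, and the camera index 0 is an assumption.

    # Track a user-selected object with the MIL tracker.
    import cv2

    cap = cv2.VideoCapture(0)                      # robot / laptop camera (assumed)
    ok, frame = cap.read()
    bbox = cv2.selectROI("select object", frame)   # let the user draw a box

    tracker = cv2.TrackerMIL_create()              # or cv2.legacy.TrackerMIL_create()
    tracker.init(frame, bbox)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        found, bbox = tracker.update(frame)
        if found:
            x, y, w, h = [int(v) for v in bbox]
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("tracking", frame)
        if cv2.waitKey(1) == 27:                   # Esc to quit
            break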
7. Object type recognition: Haar, LBPH, DOT/HOG. Cascade classifiers and regressors. Gradient boosting.
Exercise: Robot following face.
video 7
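Illustrative sketch for the face-following exercise: face detection with the pre-trained Haar cascade shipped with OpenCV; the input image name is an assumption, and the detected face centre would then drive the robot's head.

    # Face detection with a pre-trained Haar cascade classifier.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    frame = cv2.imread("camera_frame.png")            # assumed sample image
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        print("face at centre:", x + w // 2, y + h // 2)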
8. Perception and action based on deep learning.
Exercise: Robot running deep learning models.
video 8
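Illustrative sketch: image classification with a pre-trained network from torchvision (MobileNetV2 with ImageNet weights) as one concrete example of perception based on deep learning; assumes torchvision >= 0.13 and an example image file.

    # Run a pre-trained deep network on a camera image.
    import torch
    from torchvision import models, transforms
    from PIL import Image

    model = models.mobilenet_v2(weights="DEFAULT")   # downloads ImageNet weights
    model.eval()

    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    image = Image.open("camera_frame.png").convert("RGB")   # assumed sample image
    with torch.no_grad():
        logits = model(preprocess(image).unsqueeze(0))
    print("predicted ImageNet class index:", int(logits.argmax()))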
9. Cognitive approach to robot control. GOFAI, planning. STRIPS. Sussman anomaly. Frame problem.
Exercise: STRIPS and simulated SHAKEY
video 9
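Illustrative sketch: a STRIPS-style operator and the Sussman-anomaly start state encoded as Python sets of predicates, with a simple applicability check; the predicate and operator names are invented for the example.

    # Minimal STRIPS-style representation: preconditions, add list, delete list.
    def applicable(op, state):
        return op["pre"] <= state

    def apply_op(op, state):
        return (state - op["del"]) | op["add"]

    # Operator: move block A from the table onto block B.
    move_A_onto_B = {
        "pre": {("clear", "A"), ("clear", "B"), ("ontable", "A")},
        "add": {("on", "A", "B")},
        "del": {("clear", "B"), ("ontable", "A")},
    }

    # Sussman anomaly: C is on A, A and B are on the table; goal is A on B on C.
    state = {("on", "C", "A"), ("ontable", "A"), ("ontable", "B"),
             ("clear", "C"), ("clear", "B")}

    print(applicable(move_A_onto_B, state))   # False: A is not clear (C is on it)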
10. Post-cognitive approach to robot control. Emergence of control in modular control architecture. Brooks' subsumption architecture. Embodiment. Situated robots.
Minsky's Society of Mind model. Inspiration from Piaget's developmental psychology. Imitation performed by the robot.
Exercise: iCubSim interacting with human
video 10a
video 10b
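Illustrative sketch of subsumption-style layering: when a higher-priority behaviour produces output it suppresses the lower-priority one; the sensor values and commands are invented for the example and do not reproduce Brooks' original wiring.

    # A minimal sketch of layered behaviours with suppression.
    def avoid_obstacle(sensors):
        """Reflex behaviour: fires only when an obstacle is close."""
        if sensors["distance"] < 0.2:
            return "turn_away"
        return None

    def wander(sensors):
        """Default exploratory behaviour (lower priority here)."""
        return "move_forward"

    def control(sensors):
        # The higher-priority behaviour suppresses the lower one when it fires.
        return avoid_obstacle(sensors) or wander(sensors)

    print(control({"distance": 0.1}))   # turn_away
    print(control({"distance": 1.5}))   # move_forward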
11. Cloud technology for robots. The Pepper robot. Google Cloud. IBM Watson. Microsoft Azure.
Exercise: Voice control of iCubSim by calling Google Cloud.
video 11
Download the pure-Python motor-control GUI for iCubSim (install PySimpleGUI)
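Illustrative sketch for the voice-control exercise: sending microphone audio to Google's speech recognition service via the SpeechRecognition package (pip install SpeechRecognition pyaudio); mapping the recognized text to iCubSim motor actions is left out.

    # Recognize a spoken command via Google's speech recognition service.
    import speech_recognition as sr

    recognizer = sr.Recognizer()
    with sr.Microphone() as source:
        print("say a command...")
        audio = recognizer.listen(source)

    try:
        command = recognizer.recognize_google(audio, language="en-US")
        print("recognized:", command)
        # Here the command string would be mapped to an iCubSim motor action.
    except sr.UnknownValueError:
        print("speech was not understood")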
Project proposals (invent your own idea inspired by these, or select one from the list):
- iCubSim recognizes the ball on the table, takes it from the table, and passes it from hand to hand (Anton Škorec)
- iCubSim speaks according to the emotion of the person in front of the camera (Jelena Epifanic)
- Nico pointing to the grid fields (Hana Hornackova)
- iCubSim moves according to the pitch of a whistle
- iCubSim touches its head or the table according to a voice command
- Control simulated ALLEN by voice (use Recognize4PC, e.g. call Google Cloud for speech recognition)
- STRIPS controlling simulated ALLEN to navigate to several places inside and outside the room
- iCubSim moves both hands randomly and says "Au" when it hits the table
- iCubSim recognizes the ball on the table (a single position directly in front of the robot), takes the ball from the table, lifts it up and releases it
- iCubSim recognizes the positions of the ball and its hand. It randomly selects a joint and a direction of movement to put the hand on the ball by trial movements, following feedback from the observed positions
- Shape detection of a white ball: use the circular Hough transform and select only white objects
- Take a photo of the iCubSim head, print it out and rotate the picture in front of the camera.
Let iCubSim move its head to the left or to the right according to the rotation of the image.
Employ the ORB feature detector
- iCubSim mimics movements captured by the camera in a more sophisticated way
- iCubSim following faces