Introduction to robotics (for Cognitive Science)
1-hour lecture + 2-hour computer exercises, taught in 2022 and 2023
1. Action. Actuators. Controllers. Robots.
Exercise: iCubSim touching ball on table
video 1
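The controller topic can be illustrated with a minimal PID loop. This is a generic sketch, not the iCubSim API; the `PID` class, the gains, and the first-order plant are all illustrative:

```python
class PID:
    """Minimal PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a first-order plant x' = u toward the setpoint 1.0.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
x = 0.0
for _ in range(5000):          # 50 simulated seconds
    x += pid.step(1.0, x) * 0.01
print(x)                       # converges to the setpoint
```

The same step() loop structure applies when the "plant" is a robot joint and the measurement comes from an encoder.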
2. Kinematics: direct and inverse.
Exercise: iCubSim kinematics
video 2
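For a planar 2-link arm, both direct and inverse kinematics have closed forms; a numpy sketch (the link lengths are illustrative, not the iCub's):

```python
import numpy as np

L1, L2 = 0.3, 0.25  # link lengths in meters (illustrative)

def forward(q1, q2):
    """Direct kinematics: joint angles -> end-effector position (x, y)."""
    x = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
    y = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)
    return x, y

def inverse(x, y):
    """Inverse kinematics, closed form (elbow-down solution)."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    q2 = np.arccos(np.clip(c2, -1.0, 1.0))
    q1 = np.arctan2(y, x) - np.arctan2(L2 * np.sin(q2), L1 + L2 * np.cos(q2))
    return q1, q2

q1, q2 = inverse(0.4, 0.2)
print(forward(q1, q2))  # round trip recovers (0.4, 0.2)
```

Real arms like the iCub's have many more joints, so their inverse kinematics is solved numerically rather than in closed form.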
3. Perception. Sensors. Basic processing of sensor data: distance, camera image, depth map.
Exercise: iCubSim seeing a ball on the table via a color filter
video 3
numpy library exercise
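The color-filter idea behind the exercise can be sketched in pure numpy. The synthetic image and the thresholds are illustrative; real iCubSim frames arrive via YARP ports:

```python
import numpy as np

# Synthetic 100x100 RGB frame: grey background with a red "ball" patch.
img = np.full((100, 100, 3), 128, dtype=np.uint8)
img[40:60, 55:75] = (200, 30, 30)

# Color filter: keep pixels whose red channel clearly dominates.
r = img[..., 0].astype(int)
g = img[..., 1].astype(int)
b = img[..., 2].astype(int)
mask = (r > 150) & (r - g > 60) & (r - b > 60)

# Ball position = centroid of the masked pixels.
ys, xs = np.nonzero(mask)
cy, cx = ys.mean(), xs.mean()
print(cy, cx)  # center of the red patch
```

The centroid in image coordinates is what the reaching exercise needs: it tells the robot where to move its gaze or hand.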
4. Control. Decomposition of the control system. Blackboard architecture.
Exercise: iCubSim taking and dropping an object, speaking and moving its lips.
video 4
5. Computer vision without machine learning. Recognition of regular objects.
Hough transform.
Exercise: JetBot following a ping-pong ball
video 5
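The circle Hough transform from the lecture, sketched in pure numpy for a known radius (OpenCV's `cv2.HoughCircles` handles multiple radii; the synthetic edge image here is illustrative):

```python
import numpy as np

# Synthetic edge image: points on a circle of radius 12 centered at (40, 50).
H, W, R = 80, 100, 12
theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
edges = np.zeros((H, W), dtype=bool)
edges[np.round(40 + R * np.sin(theta)).astype(int),
      np.round(50 + R * np.cos(theta)).astype(int)] = True

# Hough voting: each edge point votes for all centers at distance R from it;
# the true center collects the most votes.
acc = np.zeros((H, W))
for y, x in zip(*np.nonzero(edges)):
    vy = np.round(y - R * np.sin(theta)).astype(int)
    vx = np.round(x - R * np.cos(theta)).astype(int)
    ok = (vy >= 0) & (vy < H) & (vx >= 0) & (vx < W)
    np.add.at(acc, (vy[ok], vx[ok]), 1)

cy, cx = np.unravel_index(acc.argmax(), acc.shape)
print(cy, cx)  # detected center, close to (40, 50)
```

For the ping-pong-ball exercise, the edge image would come from an edge detector on the camera frame, and the radius from the known ball size.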
6. Computer vision without machine learning. Template-based object recognition:
Phase correlation. Feature detectors (SIFT/SURF/ORB). Trackers (MIL).
Exercise: Robot following a selected object via the MIL or KCF tracker
video 6
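Phase correlation can be demonstrated with plain numpy FFTs: the normalized cross-power spectrum of two translated images inverse-transforms to a sharp peak at the shift. The synthetic images below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.random((64, 64))                     # "template" image
b = np.roll(a, shift=(5, 12), axis=(0, 1))   # same image translated by (5, 12)

# Phase correlation: normalize the cross-power spectrum to keep only phase;
# its inverse FFT is then a delta peak at the translation.
F, G = np.fft.fft2(a), np.fft.fft2(b)
cross = np.conj(F) * G
corr = np.fft.ifft2(cross / np.abs(cross)).real
dy, dx = np.unravel_index(corr.argmax(), corr.shape)
print(dy, dx)  # recovers the shift (5, 12)
```

Because only the phase is kept, the method is robust to uniform illumination changes; feature detectors such as ORB are needed once rotation and scale come into play.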
7. Computer vision based on classic machine learning.
Object-type recognition: Haar, LBPH, DOT/HOG. Cascade classifiers and regressors. Gradient boosting.
Exercise: Robot following a face.
video 7
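Gradient boosting from the lecture can be sketched in a few lines: repeatedly fit a depth-1 tree (a stump) to the current residuals and add it with a learning rate. The 1-D regression task and all parameters below are illustrative:

```python
import numpy as np

def fit_stump(x, r):
    """Best single-threshold split of sorted 1-D x minimizing squared error on r."""
    best = None
    for t in x[:-1]:
        left, right = r[x <= t], r[x > t]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, left.mean(), right.mean())
    _, t, lv, rv = best
    return lambda z: np.where(z <= t, lv, rv)

def boost(x, y, rounds=200, lr=0.5):
    """Gradient boosting for squared loss = iterated residual fitting."""
    pred = np.full_like(y, y.mean())
    for _ in range(rounds):
        stump = fit_stump(x, y - pred)   # fit the current residuals
        pred = pred + lr * stump(x)
    return pred

x = np.linspace(0, 1, 40)
y = np.sin(2 * np.pi * x)
pred = boost(x, y)
print(np.abs(pred - y).mean())  # small training error
```

For squared loss the residuals are exactly the negative gradient of the loss, which is why this simple residual-fitting loop is called gradient boosting.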
8. Perception and action based on deep learning.
Exercise: Robot running deep learning models.
video 8
9. Large language models.
Exercise: Controlling the iCubSim robot with the LLM LaMini.
video 9
10. Cognitive approach to robot control. GOFAI, planning. STRIPS. Sussman anomaly. Frame problem.
Exercise: STRIPS and simulated SHAKEY
video 10
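A minimal STRIPS-style planner for the blocks world makes the Sussman anomaly concrete. This is a sketch with breadth-first search over states; the fact tuples and move operators are an illustrative encoding, not classical STRIPS syntax:

```python
from collections import deque

# Blocks world, Sussman anomaly: start with C on A, A and B on the table;
# goal: A on B and B on C. A state is a frozenset of fact tuples.
BLOCKS = ('A', 'B', 'C')
START = frozenset({('on', 'C', 'A'), ('table', 'A'), ('table', 'B'),
                   ('clear', 'C'), ('clear', 'B')})
GOAL = {('on', 'A', 'B'), ('on', 'B', 'C')}

def successors(state):
    """Yield (action, next_state) for each applicable move operator."""
    clear = {b for b in BLOCKS if ('clear', b) in state}
    for b in clear:
        below = next((f[2] for f in state if f[0] == 'on' and f[1] == b), None)
        for dst in clear - {b}:                      # move b onto a clear block
            add, rem = {('on', b, dst)}, {('clear', dst)}
            if below:
                add.add(('clear', below)); rem.add(('on', b, below))
            else:
                rem.add(('table', b))
            yield (f'move {b} onto {dst}', state - rem | add)
        if below:                                    # move b to the table
            yield (f'move {b} to table',
                   state - {('on', b, below)} | {('table', b), ('clear', below)})

def plan(start, goal):
    """Breadth-first search; returns a shortest action sequence."""
    frontier, seen = deque([(start, [])]), {start}
    while frontier:
        state, actions = frontier.popleft()
        if goal <= state:
            return actions
        for action, nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, actions + [action]))

print(plan(START, GOAL))  # three moves: C to table, B onto C, A onto B
```

The anomaly shows up if the two goal conjuncts are pursued one at a time: achieving either one first must be partially undone, whereas search over the whole goal finds the interleaved 3-step plan.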
11. Post-cognitive approach to robot control. Emergence of control in modular control architecture. Brooks' subsumption architecture. Embodiment. Situated robots.
Cambrian intelligence.
Exercise: Subsumption architecture controlling the simulated ALLEN
video 11
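A minimal sketch of Brooks' idea that higher-priority behaviors suppress lower ones: the behavior names, sensor keys, and commands below are illustrative, not the ALLEN simulator's API.

```python
# Each behavior maps sensor readings to a command, or None ("no opinion").

def avoid(sensors):           # highest layer: collision avoidance
    if sensors['front_distance'] < 0.2:
        return 'turn_left'

def follow_light(sensors):    # middle layer: phototaxis
    if sensors['light'] > 0.5:
        return 'forward'

def wander(sensors):          # lowest layer: default behavior
    return 'forward_random'

LAYERS = [avoid, follow_light, wander]   # ordered by priority

def control(sensors):
    """The first layer with an opinion suppresses (subsumes) all layers below."""
    for behaviour in LAYERS:
        command = behaviour(sensors)
        if command is not None:
            return command

print(control({'front_distance': 0.1, 'light': 0.9}))  # 'turn_left'
print(control({'front_distance': 1.0, 'light': 0.9}))  # 'forward'
print(control({'front_distance': 1.0, 'light': 0.1}))  # 'forward_random'
```

No layer models the world or plans; coherent behavior emerges from the priority ordering alone, which is the point of the subsumption architecture.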
12. Cloud technology for robots. Robot Pepper. Google Cloud. IBM Watson. Microsoft Azure. OpenAI.
Exercise: Voice control of iCubSim via calls to Google Cloud and ChatGPT.
video 12
download the pure-Python motor-control GUI for iCubSim (install PySimpleGUI)
Project proposals (invent your own idea or select one from the list):
- Nico draws a shown picture on the touchscreen. (Sarah Marie)
- iCubSim moves according to the pitch of a whistle
- iCubSim touching its head or the table according to a voice command
- Control simulated ALLEN by voice (use Recognize4PC, e.g. call Google Cloud for speech recognition)
- Control simulated ALLEN (or JetBot) by hand poses (using a hand-pose detector) (Peter)
- STRIPS controlling simulated ALLEN to navigate among several places inside and outside the room (Alice)
- iCubSim moves both hands randomly and says "Au" when it hits the table (Laura)
- iCubSim which recognizes a ball on the table (one position directly in front of the robot), takes the ball from the table, lifts it up, and releases it
- iCubSim which recognizes the positions of the ball and its hand; it randomly selects a joint and a direction of movement, putting the hand on the ball by trial movements guided by feedback from the observed positions
- shape detection of a white ball: use the circle Hough transform and select only white objects
- make a photo of the iCubSim head, print it out, and rotate the picture in front of the camera.
Let iCubSim move its head to the left or to the right according to the rotation of the image.
Employ the ORB feature detector
- iCubSim which mimics movement captured by the camera in a more sophisticated way
- iCubSim following faces and recognizing persons