Now that we have a basic inverse kinematic model, the sky's the limit. Wait. The horsepower of a Raspberry Pi Model A is a pretty good limit. In this video, the eye vector is tracing out a circle, trying to hold 1 second per circle.
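The circle-tracing demo above can be sketched as a gaze-target generator: a unit "eye vector" sweeping a small cone around straight-ahead, completing one revolution per second. This is a minimal sketch, not the actual demo code — the tilt half-angle `TILT_RAD` and the +z "straight ahead" convention are assumptions.

```python
import math

PERIOD_S = 1.0    # one full circle per second, as in the demo
TILT_RAD = 0.15   # half-angle of the traced cone (assumed, radians)

def eye_vector(t):
    """Unit gaze vector at time t, tracing a cone around +z."""
    phase = 2 * math.pi * (t / PERIOD_S)
    x = math.sin(TILT_RAD) * math.cos(phase)
    y = math.sin(TILT_RAD) * math.sin(phase)
    z = math.cos(TILT_RAD)
    return (x, y, z)
```

In a control loop you would call something like `eye_vector(time.monotonic())` each tick and feed the result to the inverse kinematics. On a Pi Model A, precomputing one period of targets into a lookup table is one way to keep the per-tick cost down.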
Unfortunately for us, forward kinematics doesn't let our robot control itself. To determine the servo positions that achieve a desired robot pose, we need inverse kinematics.
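Under a simplified model of the platform, the inversion can even be done in closed form. A sketch, assuming (not taken from the real robot) that the three rods attach at 120-degree intervals on a circle of radius `R`, that each servo horn of length `L` lifts its attachment point by roughly `L*sin(angle)`, and that the platform is the plane through the three lifted points:

```python
import numpy as np

R = 0.04   # platform attachment radius, meters (assumed)
L = 0.02   # servo horn length, meters (assumed)

def inverse_kinematics(target_normal):
    """Rod servo angles that tilt the platform to the given normal.

    For a plane with normal (nx, ny, nz) through the origin, the
    height at (x, y) is z = -(nx*x + ny*y)/nz; each servo just has
    to lift its corner to that height.
    """
    nx, ny, nz = np.asarray(target_normal, dtype=float)
    angles = []
    for bearing in np.radians([90, 210, 330]):   # assumed layout
        x, y = R * np.cos(bearing), R * np.sin(bearing)
        z = -(nx * x + ny * y) / nz              # required lift here
        angles.append(float(np.arcsin(np.clip(z / L, -1, 1))))
    return angles
```

For example, `inverse_kinematics([0.0, 0.0, 1.0])` (platform level, facing straight up) returns three zero angles. The real linkage has rod-end offsets this model ignores, so treat it as a starting point, not the answer.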
Determining the state of the robot from its control inputs is forward kinematics: for a given set of 4 servo positions, we should be able to predict the orientation of every part of the robot.
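For the three rod servos, a forward-kinematic sketch can be surprisingly short. This is a toy model, not the robot's actual geometry: it assumes the rods attach at 120-degree intervals on a circle of radius `R`, that each servo horn of length `L` lifts its attachment point by about `L*sin(angle)`, and it ignores the fourth servo entirely. The platform orientation then falls out as the normal of the plane through the three lifted points.

```python
import numpy as np

R = 0.04   # platform attachment radius, meters (assumed)
L = 0.02   # servo horn length, meters (assumed)

def forward_kinematics(servo_angles_rad):
    """Estimate the platform's unit normal from three rod servo angles."""
    bearings = np.radians([90, 210, 330])        # assumed attachment layout
    pts = []
    for bearing, a in zip(bearings, servo_angles_rad):
        x, y = R * np.cos(bearing), R * np.sin(bearing)
        z = L * np.sin(a)                        # this rod lifts this corner
        pts.append(np.array([x, y, z]))
    p0, p1, p2 = pts
    n = np.cross(p1 - p0, p2 - p0)               # normal of the platform plane
    return n / np.linalg.norm(n)

# All servos level: the platform normal points straight up.
print(forward_kinematics([0.0, 0.0, 0.0]))       # [0. 0. 1.]
```

The real rods constrain their endpoints to spheres rather than vertical lines, so a faithful model needs the full linkage geometry — but a sketch like this is enough to sanity-check servo directions.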
Let's make a desktop robot that moves. But wait, we have no tools.
It's been a while since I've done any robotics, so let's make that happen. When designing an autonomous system, it's critical to have clear objectives so that the requirements for end effectors, sensory inputs, and processing resources can be met within your cost and time budgets. Instead of that, let's try to settle an office debate: what are the kinematics of a robotic platform positioned by 3 control rods?
Then, we'll put eyes on it. And it needs to fit on my desk; my apartment is not very large.