Hi,
I'm working on a mobile robotic camera platform / processing station. Currently I position my DSLR with motors, configure and trigger exposures via USB, and use accelerometers with PID control for closed-loop positioning - all driven from a lowly AVR (plus a USB host controller), commanded over a UART or by the accelerometer in an Android phone (via BT). I have only done rough prototyping of the robotics, and have a ways to go before I have any sort of stable platform I can publish design files for. Only then can I begin to look at mechanical zoom control and so forth.
The Parallella is the perfect fit for the processing brain: a low-power, highly customisable platform (I am interested both in run-time reconfiguration of the FPGA for highly tuned operations and in the Epiphany as a more general-purpose co-processor). The Porcupine is what I need for the Parallella to become the single brain that controls the robotics platform (via the AVR) as well as configures/triggers exposures and retrieves and processes images (via USB).
I daresay there will be overlap with the image-processing side of advanced UAV/gimbal-style projects moving to the Parallella. The robotics part itself could be used for any number of applications involving camera control and image processing, or guided video capture. My interests are real-time target tracking/acquisition (eventually - it requires additional instrumentation, of course), exposure bracketing, HDR, panorama capture/stitching, GPS tagging (since my DSLR lacks it but my Android device doesn't), and programmed tracking / intervalometer tasks, such as tracking stars for astrophotography.
This is a hobby project - I have limited time and finances, and progress will be slow - but if you can help out, and there is interest, I will eventually create a tech blog of sorts to share my progress online (and of course plug the Parallella at my local robotics club).
Regards,
Yani.