First Parallella User Created Video

One of the most active Parallella forum members, “@shodruky”, received a board a few days ago — and check out what he did! Simply amazing!
@shodruky played around with the board to try to reproduce what we did for the Kickstarter demo video last year. We still have some tuning left to optimize performance, but it’s not a bad start. We have a computer!

[youtube=http://youtu.be/ZYUNBRiqpkA&w=480&h=360]


Combining the Parallella with a $25 Raspberry Pi makes so much sense. Each board performs the job it does best. Parallella crunches numbers and the Raspberry Pi handles the graphics. Heterogeneous computing at its finest!

[youtube=https://www.youtube.com/watch?v=-lCWlu1EnsM&w=480&h=360]
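As a toy illustration of that division of labor, here is a minimal sketch in plain Python (all names are illustrative, not the actual demo code): one process plays the number-crunching role and streams results over a localhost socket to a second process standing in for the display side. The real setup would talk across the boards' Ethernet link instead.

```python
import json
import socket
import threading

def crunch(n):
    # Stand-in for the number-crunching side (the Parallella's role):
    # here just a sum of squares over 0..n-1.
    return sum(i * i for i in range(n))

def compute_node(host, port, jobs):
    # Run each job and stream its result, one JSON line per job,
    # to the display node.
    with socket.create_connection((host, port)) as conn:
        for n in jobs:
            msg = json.dumps({"n": n, "result": crunch(n)}) + "\n"
            conn.sendall(msg.encode())

def display_node(server, frames):
    # Stand-in for the graphics side (the Raspberry Pi's role):
    # receive results and "render" them (here, just collect them).
    conn, _ = server.accept()
    with conn, conn.makefile() as f:
        for line in f:
            frames.append(json.loads(line))

server = socket.create_server(("127.0.0.1", 0))  # OS picks a free port
port = server.getsockname()[1]
frames = []
renderer = threading.Thread(target=display_node, args=(server, frames))
renderer.start()
compute_node("127.0.0.1", port, jobs=[10, 100])
renderer.join()
server.close()
print(frames)  # [{'n': 10, 'result': 285}, {'n': 100, 'result': 328350}]
```

Each side only does its own job: the compute process never touches presentation, and the display process never recomputes anything — the same split the two boards make over Ethernet.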

14 Comments

  • Marcel Gommans says:

    Is the Parallella (in theory) able to handle graphics just as well as the Raspberry Pi?

  • Al Thomas says:

    In theory? No. In practice? Also no. There is no GPU on Parallella.

  • Jessie says:

    Are you new to reading, Marcel? The Parallella has no GPU.

    • Marcel Gommans says:

      If I am correct, the Parallella comes with an HDMI connector, so one way or another it is capable of showing graphics on a screen. If there is no way to compete with the Pi using this connector, the Parallella is not for me. No matter how you put it, the Parallella has a GPU, even if it is combined with a CPU.

      • Al Thomas says:

        Well, you get a full GUI with linaro-ubuntu-desktop, but it would just be an (unaccelerated) X server on the Zynq CPU cores, versus hardware acceleration on anything with a GPU, embedded or discrete. A more interesting demo would be gl_server tiling multiple Parallellas into a single GPU-enhanced board. The RPi’s Fast Ethernet port might not be up to that. And it still doesn’t exercise the Epiphany chip… hmm, SGEMM in OpenGL across a socket? Where is my ATI programming reference from 2000 when I need it? It does beg the question, though: is there OpenCL support for the RPi’s GPU? Probably not…

  • Alan Campbell says:

    So, judging by the comments here, the HDMI on the current board is “good enough” to draw text on a screen? It’s just fancy graphics that it can’t handle?

    Sounds like a job for… an extender board. Throw in some audio I/O while you’re at it, and call it an AV extender. If you want fancy [but expensive], tack on a huge LCD screen while you’re at it, like the 7″ screen available for the BeagleBone Black.

    Question: has anyone started designing extender boards for the Parallella yet?

    • Al Thomas says:

      Well, you can get a graphical display off the Zedboard part (Zedboard = Parallella – Epiphany), but it is pretty anemic… nothing like the 1080p demo vid linked in this article. If a graphical display is your priority, rather than just a console, then yes, you will need a second board right now. As you suggest, it would be way better to talk to some board like an RPi across e-link rather than Ethernet. But what would be even better is if we could replace the Zynq-7000 with some “equivalent” die with a GPU in it, like the Samsung Exynos 4412. I have to figure there is some issue with the reconfigurable logic that requires us to use an FPGA rather than a hard ARM processor. The original Parallella dev boards were daughterboards on a Zedboard, and the Zedboard has all sorts of stuff on it, including audio.

      • Mert says:

        Well, even though the Parallella is a good board for multi-board, parallel usage, there are other solutions out there at a similar budget. Go check out the ODROID boards; one might fit your needs. Or perhaps the “Radxa Rock Dev Board” with “Mali-400 MP4 @ 533 MHz, OpenGL ES 2.0”, or even better the Wandboard, with its Vivante GC2000 + GC355 + GC320 chips all together!

  • Bhushan Gawnade says:

    Can you tell me the overall working, use, specification, and architecture of the Parallella, and what is the main advantage of the Parallella? Please forward it to my email ID:
    bpgawande@gmail.com

  • Gael N says:

    While the Parallella doesn’t have a GPU, it is meant to do parallel computing the way a GPU does for general-purpose workloads (GPGPU). Couldn’t a driver be written that would treat the Epiphany chip like a GPU?
