by piotr5 » Thu Jan 03, 2013 12:15 am
Really? Terabytes per core? It would be interesting, then, to have a 64-bit ARM processor with those terabytes attached, so the coprocessor's cores could indirectly address that additional memory even while remaining 32-bit themselves. Quite a challenge to implement the DMA transfer though, and the price would be way beyond $100 for the memory alone...
Anyway, Parallella is made by someone who worked in signal processing, so of course that's the only thing it will be good at! However, signal processing is what our brain does all the time, with very little actual storage per neuron. So maybe those 99% of HPC problems only came into existence because there never was a low-memory signal-processing supercomputer? That's always the problem with using statistics for economic planning: you never know whether the lack of some product biased the statistics against that product's introduction! Remember what someone once said about the invention of computers, that a handful of them worldwide would be sufficient. Now everybody has one...