Parallella Community • View topic - TensorFlow


TensorFlow

Post by dobkeratops » Tue Nov 10, 2015 9:11 pm

http://www.tensorflow.org

Would Google's recently open-sourced TensorFlow machine-learning framework be a good fit for the Epiphany architecture?

It seems to deal with data-flow graphs for machine learning, currently has CPU and GPU implementations, and works on multidimensional data. I suppose each processing node would translate into a group of cores on the Epiphany grid, DMA'ing results between them without having to touch external memory... it seems a perfect fit.
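
To make that concrete, here is a tiny example in the Python API Google released (2015-era TensorFlow, hence tf.Session and tf.placeholder); the layer sizes are arbitrary, and the idea that each node would map onto a group of Epiphany cores with DMA between them is of course just my speculation:

Code: Select all
    import tensorflow as tf

    # Each op below is one node in TensorFlow's data-flow graph. The speculation
    # above is that a node (or a fused group of nodes) would be placed on a group
    # of Epiphany cores, with DMA carrying the intermediate tensors between
    # groups instead of round-tripping through external memory.
    x = tf.placeholder(tf.float32, shape=[None, 784])   # input batch
    w = tf.Variable(tf.zeros([784, 10]))                 # layer weights
    b = tf.Variable(tf.zeros([10]))                      # biases
    y = tf.nn.relu(tf.matmul(x, w) + b)                  # matmul -> add -> relu nodes

    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())          # 2015-era initializer
        # sess.run(y, feed_dict={x: some_batch}) would execute the graph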

Is anyone looking into these types of workload? Given the profile of Google and AI, would it be a good avenue to get demand and users for the Epiphany architecture?

There's been a lot of talk of new hardware emerging for AI, and I would personally much rather see something versatile like the Epiphany eating into GPGPU (and capable of vertex processing in graphics) than dedicated neural hardware tailored to one specific use case.

I suspect this is something the Epiphany could do better than GPUs, and it's all the rage at the minute.
Last edited by dobkeratops on Tue Nov 10, 2015 11:35 pm, edited 2 times in total.
dobkeratops
 
Posts: 189
Joined: Fri Jun 05, 2015 6:42 pm
Location: uk

Re: TensorFlow

Post by aolofsson » Tue Nov 10, 2015 9:21 pm

Looks great! Thanks for posting this. Quite CUDA-centric, but very impressive.
http://download.tensorflow.org/paper/whitepaper2015.pdf
aolofsson
 
Posts: 1005
Joined: Tue Dec 11, 2012 6:59 pm
Location: Lexington, Massachusetts, USA

Re: TensorFlow

Post by 8l » Thu Nov 12, 2015 1:13 pm

8l
 
Posts: 173
Joined: Mon Dec 17, 2012 3:23 am

Re: TensorFlow

Post by piotr5 » Fri Nov 13, 2015 4:49 pm

If it were not for Parallella I'd be into CUDA and whatever GPGPU. But the truth is, when it comes to mathematics, SIMD processors are useless beyond anything linear. Tensors of course sound linear, so I can understand why it's been written CUDA-centric, but AFAIK it doesn't need to be. So before anybody goes off buying the Jetson TX1 board: how many independent instructions can it actually perform, how many wave-fronts are possible simultaneously, really 256?

Financial predictions have been centered on linear approximation, and as a result we got the financial crisis. Do you want your deep-learning approach to go down the same drain? For example, neural networks are often simulated by linear transformations. If each neuron has 1000 connections, this approach needs 1000 times the memory and computation power a non-linear approach would need. Just because the literature is CUDA-centric doesn't mean there is no better approach; especially if your SIMD system has fewer than 1000 execution cores per wave-front, you might try to think outside the linear-algebra box. I bet that's what dobkeratops meant about TensorFlow being a good fit for Epiphany...
piotr5
 
Posts: 230
Joined: Sun Dec 23, 2012 2:48 pm

Re: TensorFlow

Post by dobkeratops » Sun Nov 15, 2015 5:16 pm

The reason I've posted this: it seems TensorFlow allows expressing computations as data-flow graphs, which is a model more suited to the Epiphany's unique features than OpenCL, even if the tensors themselves are 'linear'.

GPUs are successfully running image recognition and other tasks, e.g. convolutional neural networks.
Values (filter weights and images) are read in and can be processed in parallel, then written to a temporary before being handed to the next layer. They're working fine because the 'writes' can still be buffered on chip in the caches and passed from stage to stage. One stage (an entire layer of neurons) can wait for the previous one to complete. With convolutions there is vastly more 'read' work than 'write', and there is a well-defined separation between the layers.
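
To be clear about that read/write imbalance, here's a naive sketch in plain NumPy (nothing Epiphany- or TensorFlow-specific), just counting the traffic of one 'valid' convolution:

Code: Select all
    import numpy as np

    def conv2d_valid(image, kernel):
        # For every single value written to the output, kernel.size values are
        # read from the input window (plus the kernel itself) - this is why the
        # 'read' work dwarfs the 'write' work in a convolutional layer.
        kh, kw = kernel.shape
        oh = image.shape[0] - kh + 1
        ow = image.shape[1] - kw + 1
        out = np.empty((oh, ow), dtype=image.dtype)
        for i in range(oh):
            for j in range(ow):
                # kh*kw reads of the input, one write to the output
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    # e.g. a 5x5 kernel: 25 input reads (and 25 weight reads) per value written
    out = conv2d_valid(np.random.rand(32, 32), np.random.rand(5, 5))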

The hope is that the Epiphany architecture can do this better: by explicitly reasoning about on-chip communication for the temporaries, perhaps by having smarter ways of handling sparsity (in turn leveraging the greater divergent flow it's capable of), and perhaps by providing more scope for complex nets (recurrent, etc.).

Also note that TensorFlow has been opened to the community in part to get it to evolve; there's a chance to steer it.

The point is that software and applications are complex, and the Epiphany needs software to become popular. If there is a common, portable toolset with more potential to exploit it than OpenCL, more people will consider the Epiphany architecture, and in turn we'll be more likely to see the 1000+ core versions mass-produced. Imagine if you could get Google interested: they could put in a huge order and we'd get the chip we want.

There is the chicken-and-egg situation: until the Epiphany gets software, the chips won't be mass-produced, and GPU solutions like the TX1 will remain superior.
dobkeratops
 
Posts: 189
Joined: Fri Jun 05, 2015 6:42 pm
Location: uk

Re: TensorFlow

Post by piotr5 » Mon Nov 16, 2015 9:59 am

IMHO TensorFlow is too general-purpose to support the Parallella architecture. It contains commands which ask for the type of architecture you have, and an ignorant programmer will make it so that the Epiphany is never used.
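
For example, device placement in the released Python API looks like this (the "/cpu:0" and "/gpu:0" strings are what TensorFlow actually understands today; an "/epiphany:0" device is purely hypothetical and would need a whole new back-end):

Code: Select all
    import tensorflow as tf

    with tf.device("/cpu:0"):
        a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
    with tf.device("/gpu:0"):   # code hard-wired like this simply ignores other hardware
        b = tf.matmul(a, a)

    # allow_soft_placement falls back to another device if the requested one is missing
    config = tf.ConfigProto(allow_soft_placement=True)
    with tf.Session(config=config) as sess:
        print(sess.run(b))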

As for AI, I'm not sure multiplying each signal with a weight is the way to go for neural networks. The idea of using convolution sounds good, but IMHO you should then work with function objects and not with discrete weights and thresholds.

Currently I'm interested in sound encoding, and an interesting observation showed up there: while for images the Fourier-transformed data is stored (i.e. frequencies are split into 3 wavelengths called "colours"), sound is usually stored as individual samples of the wave. Our ears aren't capable of distinguishing phase offsets in sound frequencies! The idea in sound compression should be to handle sound frequencies as smooth waves, extract their wavelength and store that to conserve space, i.e. Fourier-transform the Fourier-transformed data, as is also done with image compression (this is implemented in the open-source Codec2).

Now, couldn't neural networks be compressed in the same way? Instead of storing 1000 weights, store a weight function and sort the links in an optimal way. I.e. during the learning phase you use individual weights, and then you rearrange the node numbering so that the resulting weights can be stored as functions mapping the values 0 to 1000 onto the weights, thresholds and whatever else you need, all in as little space as possible. This way you can handle 1000 connections per neuron with very small memory requirements, and you get sparse-matrix handling for free. As you said, it's lots of reads and little writes, so reduce the reads! The brain attempts to work in a similar way, I believe, except that already in the learning phase misfiring connections will cause neighbouring connections to end up with similar behaviour, so that later signals going off to the wrong connection will have about the same results too...
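
A rough sketch of that compression idea (purely illustrative NumPy, not anything TensorFlow or the Epiphany SDK provides): renumber one neuron's connections so its weights form a smooth curve, then fit a small polynomial to that curve and recompute the weight for connection k on the fly instead of reading a 1000-entry table.

Code: Select all
    import numpy as np

    np.random.seed(0)
    weights = np.random.normal(size=1000)    # one neuron's 1000 incoming weights

    order = np.argsort(weights)              # renumber the connections...
    sorted_w = weights[order]                # ...so the weights form a smooth curve
    k = np.arange(1000) / 1000.0             # connection index mapped to [0, 1)

    coeffs = np.polyfit(k, sorted_w, deg=7)  # 8 coefficients instead of 1000 weights
    approx = np.polyval(coeffs, k)           # weight for connection k, recomputed on demand

    print("max abs error:", np.max(np.abs(approx - sorted_w)))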
piotr5
 
Posts: 230
Joined: Sun Dec 23, 2012 2:48 pm

Re: TensorFlow

Post by 8l » Mon Nov 16, 2015 1:16 pm

8l
 
Posts: 173
Joined: Mon Dec 17, 2012 3:23 am

Re: TensorFlow

Post by dobkeratops » Mon Nov 16, 2015 6:28 pm

dobkeratops
 
Posts: 189
Joined: Fri Jun 05, 2015 6:42 pm
Location: uk

Re: TensorFlow

Post by piotr5 » Tue Nov 17, 2015 12:40 pm

And yet again AI boils down to graph-theoretical transformations. You know, if your main processor is the bottleneck with its calculations of how to spread the work among the available resources, while everybody else is waiting for the data, I'm not really sure you could call that parallelization. Suppose you do have a neural network with a limited number of connections per neuron. This forms a graph; some data flows in parallel along the same lines, some goes through delay routes to arrive together with later data. The effect of the latter is much like Markov chains; the former is a bit like permuting the data and doing pattern recognition based on that. I.e. it's all just algebra in some way. So why actually trouble yourself with neural networks when we have much more versatile and much better understood tools available in computer algebra? As I remember, they once tried to emulate the brain of a fly to predict its flight path. Then they set out a real fly, and it turned out the neural network failed to predict the fly's movements. Religious people would see in that a proof that the fly has a soul; I see it as a proof that the model of neural networks is inaccurate. So let's stick to what we know best: logic!

As for quantum computing, we have all seen http://www.cqc2t.org/ but I doubt this will have much impact in my lifetime. Quantum computers still require incredible cooling, and even then they won't work reliably. More promising is the technology for charging and energy transfer through quantum effects; data transfer via quantum effects could allow us to travel to other planets, in that remote control becomes possible, and propulsion using such technology also sounds interesting. But reading ExtremeTech is a bit like daydreaming; they present no actual scientific data. The fact is, computers nowadays make use of quantum effects already; it's ordinary technology advancement. What is needed for quantum computers are 3D printers which build something atom by atom, all at the lowest possible temperatures so no atom escapes before it's finished. And even when we have that, logic tells us quantum computers will have a Turing degree smaller than the next Turing degree after that of our computers, i.e. the halting problem is still unsolved by them! So you are still stuck in the problematic situation that you cannot always tell if some program will terminate execution or not...
piotr5
 
Posts: 230
Joined: Sun Dec 23, 2012 2:48 pm

Re: TensorFlow

Post by dobkeratops » Thu Nov 19, 2015 6:59 pm

dobkeratops
 
Posts: 189
Joined: Fri Jun 05, 2015 6:42 pm
Location: uk
