Hadoop: Mapping binary files

Posted by restrictedinfinity on Stack Overflow
Published on 2010-06-10T07:38:34Z

Typically in Hadoop, an input file can be read and processed in parts by the Mapper function (as is the case with text files). Is there anything that can be done to handle binary files (say images or serialized objects), which would require all the blocks to be on the same host before processing can start?
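A common way to approach this kind of input, sketched below, is a non-splittable input format that hands each mapper one whole file as a single record. This is only a minimal sketch against Hadoop's org.apache.hadoop.mapreduce API: the class names WholeFileInputFormat and WholeFileRecordReader are illustrative, and it assumes each file is small enough to fit in a mapper's memory. Note that HDFS does not actually require the blocks to sit on the same host; the single map task simply streams any non-local blocks over the network while reading.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

/** Hands each mapper one whole file as a single (NullWritable, BytesWritable) record. */
public class WholeFileInputFormat extends FileInputFormat<NullWritable, BytesWritable> {

    @Override
    protected boolean isSplitable(JobContext context, Path file) {
        return false;  // never split the file: one file -> one split -> one map task
    }

    @Override
    public RecordReader<NullWritable, BytesWritable> createRecordReader(
            InputSplit split, TaskAttemptContext context) {
        return new WholeFileRecordReader();
    }

    /** Reads the entire file into memory and emits it as a single value. */
    public static class WholeFileRecordReader extends RecordReader<NullWritable, BytesWritable> {

        private FileSplit fileSplit;
        private Configuration conf;
        private final BytesWritable value = new BytesWritable();
        private boolean processed = false;

        @Override
        public void initialize(InputSplit split, TaskAttemptContext context) {
            this.fileSplit = (FileSplit) split;
            this.conf = context.getConfiguration();
        }

        @Override
        public boolean nextKeyValue() throws IOException {
            if (processed) {
                return false;  // the single record has already been delivered
            }
            byte[] contents = new byte[(int) fileSplit.getLength()];
            Path file = fileSplit.getPath();
            FileSystem fs = file.getFileSystem(conf);
            FSDataInputStream in = null;
            try {
                in = fs.open(file);
                // May pull non-local blocks over the network.
                IOUtils.readFully(in, contents, 0, contents.length);
                value.set(contents, 0, contents.length);
            } finally {
                IOUtils.closeStream(in);
            }
            processed = true;
            return true;
        }

        @Override
        public NullWritable getCurrentKey() {
            return NullWritable.get();
        }

        @Override
        public BytesWritable getCurrentValue() {
            return value;
        }

        @Override
        public float getProgress() {
            return processed ? 1.0f : 0.0f;
        }

        @Override
        public void close() {
            // Nothing to clean up; the stream is closed in nextKeyValue().
        }
    }
}
```

To use it, the job would register the format with job.setInputFormatClass(WholeFileInputFormat.class), and each map() call would then receive the complete bytes of one binary file (image, serialized object, etc.) as its value.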

© Stack Overflow or respective owner
