Monday 1 August 2016

LzoCodec Not Found: NiFi HDFS Access Issue

NiFi can read from HDFS with the GetHDFS processor and write to HDFS with the PutHDFS processor.

In both processors' configuration, set the Hadoop Configuration Resources property to /etc/hadoop/conf/hdfs-site.xml and /etc/hadoop/conf/core-site.xml, separated by a comma.
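For example, assuming the standard /etc/hadoop/conf client-config location used in this post, the value pasted into Hadoop Configuration Resources would look like this:

/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml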

Error:

java.lang.IllegalArgumentException: Compression codec com.hadoop.compression.lzo.LzoCodec not found.

The hadoop-lzo jar is actually present on the node at
/usr/hdp/2.5.0.0-817/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar
but it is not on NiFi's own classpath, so NiFi cannot load the codec class that core-site.xml declares.

Solution:

Check whether the io.compression.codecs property in the core-site.xml file that NiFi reads contains the following entries, and if it does, remove them:

com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec
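As a quick check before editing, you can print the property from the standard HDP config location (the -A 1 assumes the <value> element sits on the line right after <name>, which is the usual layout):

grep -A 1 "io.compression.codecs" /etc/hadoop/conf/core-site.xml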

Alternatively, before starting the NiFi service, set LD_LIBRARY_PATH to the directory containing the native .so files for the LZO codec (adjust the path to match your installed HDP version):

export LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native 
bin/nifi.sh start
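To confirm that the native LZO library is actually in that directory, you can list it first; libgplcompression is the library name typically shipped with hadoop-lzo, so treat the exact filename (and the HDP version in the path) as assumptions to verify on your node:

ls /usr/hdp/2.5.0.0-817/hadoop/lib/native/libgplcompression*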

Reference:
https://community.hortonworks.com/questions/33774/puthdfs-processor-not-working-noclassdeffounderror.html
