Avro : java.lang.RuntimeException: Unsupported type in record

Input: test.csv

100
101
102

Pig script:

-- the required jars are registered via REGISTER statements

A = LOAD 'test.csv'  USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);

STORE A INTO 'test' USING org.apache.pig.piggybank.storage.avro.AvroStorage
    ('schema',
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test","doc":"Avro Test Schema",
        "fields":[
            {"name":"code","type":["string","null"],"default":null}
            ]}'
    );

A runtime error occurs during the STORE. Any input on resolving this issue would be appreciated.

Error log:

ERROR org.apache.pig.tools.pigstats.SimplePigStats - ERROR 2997: Unable to recreate exception from backed error: org.apache.avro.file.DataFileWriter$AppendWriteException: java.lang.RuntimeException: Unsupported type in record:class java.lang.String
at org.apache.avro.file.DataFileWriter.append(DataFileWriter.java:263)
at org.apache.pig.piggybank.storage.avro.PigAvroRecordWriter.write(PigAvroRecordWriter.java:49)
at org.apache.pig.piggybank.storage.avro.AvroStorage.putNext(AvroStorage.java:722)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:139)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat$PigRecordWriter.write(PigOutputFormat.java:98)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:558)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:85)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:106)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigMapOnly$Map.collect(PigMap
2015-06-02 23:06:03,934 [main] ERROR org.apache.pig.tools.pigstats.PigStatsUtil - 1 map reduce job(s) failed!  
2015-06-02 23:06:03,934 [main] INFO  org.apache.pig.tools.pigstats.SimplePigStats - Script Statistics: 

Looks like this is a bug: https://issues.apache.org/jira/browse/PIG-3358

If possible, try updating to Pig 0.14; according to the comments on that ticket, this issue is fixed there.
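After upgrading, you can also switch to the built-in `AvroStorage` that ships with Pig 0.12+ (`org.apache.pig.builtin.AvroStorage`), which replaced the piggybank implementation. A minimal sketch of the same STORE under that assumption is below; note that the Avro specification requires a field's `default` value to match the *first* branch of a union, so `"default": null` conventionally goes with `["null","string"]` rather than `["string","null"]`. Check the docs for your exact Pig version before relying on this signature:

```pig
A = LOAD 'test.csv' USING org.apache.pig.piggybank.storage.CSVExcelStorage() AS (code:chararray);

-- Sketch: built-in AvroStorage (Pig 0.12+) accepts the schema JSON directly;
-- the union is ordered ["null","string"] so that "default": null is valid Avro.
STORE A INTO 'test' USING AvroStorage(
    '{"namespace":"com.pig.test.avro","type":"record","name":"Avro_Test",
      "doc":"Avro Test Schema",
      "fields":[
          {"name":"code","type":["null","string"],"default":null}
      ]}'
);
```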