Hive UDAF

2020-08-17  xiaoc024

1. Background

The background of UDAFs in one sentence: the system's built-in aggregate functions cannot cover every user requirement.

2. Basic

2.1 What is a UDAF?

A UDAF is a user-defined aggregate function. First, what is an aggregate function? An aggregate function takes zero or more columns from zero to many rows as input and returns a single value; colloquially, "many in, one out." Aggregate functions are often used with the GROUP BY clause, and sum, count, and avg are common built-in examples.
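
For example, with a hypothetical table emp(dept, salary), the built-in avg collapses each group into a single output row:

SELECT dept, avg(salary) AS avg_salary
FROM emp
GROUP BY dept;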

2.2 What is an ObjectInspector?

This is a fairly involved concept, and interested readers can dig into it on their own. In short, the ObjectInspector interface frees Hive from committing to one particular data format: it lets the data stream 1) switch between different input/output formats at the two ends of the pipeline, and 2) use different data representations in different operators. For writing a UDAF it is enough to know how to use one: the evaluator below, for instance, uses PrimitiveObjectInspectors to read the raw input values and a StructObjectInspector to decode partial aggregation results.

3. Deep

3.1 The two core classes

Writing a UDAF boils down to two classes: AbstractGenericUDAFResolver, which validates the argument types and hands back a matching evaluator, and GenericUDAFEvaluator, which implements the actual aggregation logic.

3.2 GenericUDAFEvaluator

The 7 methods:

  1. init(): called once per Mode; binds the input ObjectInspectors and returns the output ObjectInspector.
  2. getNewAggregationBuffer(): allocates a new aggregation buffer.
  3. reset(): resets a buffer so it can be reused.
  4. iterate(): folds one row of original input into the buffer.
  5. terminatePartial(): emits the partial aggregation result.
  6. merge(): folds a partial aggregation result into the buffer.
  7. terminate(): emits the final aggregation result.

The 4 stages, expressed by the Mode enum (each stage determines which methods run and what init() receives):

  1. PARTIAL1: original data in, partial aggregation out; calls iterate() then terminatePartial() (map phase).
  2. PARTIAL2: partial aggregations in, partial aggregation out; calls merge() then terminatePartial() (combine phase).
  3. FINAL: partial aggregations in, full result out; calls merge() then terminate() (reduce phase).
  4. COMPLETE: original data in, full result out; calls iterate() then terminate() (map-only aggregation).

4. Best Practice

4.1 Requirement:

input:
a1:5 c1:10
a2:3 c2:40
a3:8 c3:100
output:
(a1 * c1 + a2 * c2 + a3 * c3) / (c1 + c2 + c3), i.e. a weighted average of the a values with the c values as weights
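
With the sample input this works out to (5 * 10 + 3 * 40 + 8 * 100) / (10 + 40 + 100) = 970 / 150. Note that the implementation below accumulates and divides in long arithmetic, so it returns 6 rather than 6.47.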

4.2 Code:

import java.util.ArrayList;

import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.parse.SemanticException;
import org.apache.hadoop.hive.ql.udf.generic.AbstractGenericUDAFResolver;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.StructField;
import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.LongObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
import org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
import org.apache.hadoop.io.LongWritable;

public class WeightedAverage extends AbstractGenericUDAFResolver {
    @Override
    public GenericUDAFEvaluator getEvaluator(TypeInfo[] parameters)
            throws SemanticException {
        if (parameters.length != 2) {
            throw new UDFArgumentTypeException(parameters.length - 1,
                    "Exactly two argument is expected.");
        }

        if (parameters[0].getCategory() != ObjectInspector.Category.PRIMITIVE) {
            throw new UDFArgumentTypeException(0,
                    "Only primitive type arguments are accepted but "
                            + parameters[0].getTypeName() + " is passed.");
        }
        switch (((PrimitiveTypeInfo) parameters[0]).getPrimitiveCategory()) {
            case INT:
            case LONG:
                return new WeightedAverageEvaluator();
            case BYTE:
            case SHORT:
            case FLOAT:
            case DOUBLE:
            case STRING:
            case TIMESTAMP:
            case BOOLEAN:
            default:
                throw new UDFArgumentTypeException(0,
                        "Only int or long type arguments are accepted but "
                                + parameters[0].getTypeName() + " is passed.");
        }
    }


    public static class WeightedAverageEvaluator extends GenericUDAFEvaluator {

        // input for iterate()
        PrimitiveObjectInspector avgOriginalInputOI;
        PrimitiveObjectInspector weightOriginalInputOI;

        // output for terminatePartial()
        Object[] partialAggregationResult;

        // input for merge()
        StructObjectInspector soi;
        StructField countField;
        StructField sumField;
        LongObjectInspector countFieldOI;
        LongObjectInspector sumFieldOI;

        // output for terminate()
        LongWritable fullAggregationResult;

        @Override
        public ObjectInspector init(Mode mode, ObjectInspector[] parameters)
                throws HiveException {
            super.init(mode, parameters);

            // init input
            // mode == Mode.PARTIAL1 || mode == Mode.COMPLETE
            // input: original rows, method: iterate()
            if (mode == Mode.PARTIAL1 || mode == Mode.COMPLETE) {
                avgOriginalInputOI = (PrimitiveObjectInspector) parameters[0];
                weightOriginalInputOI = (PrimitiveObjectInspector) parameters[1];
            }
            // mode == Mode.PARTIAL2 || mode == Mode.FINAL
            // input: partial aggregations, method: merge()
            else {
                // When partial aggregations are the input, this struct OI describes their layout and is used to decode them
                soi = (StructObjectInspector) parameters[0];
                countField = soi.getStructFieldRef("count");
                sumField = soi.getStructFieldRef("sum");
                // Each field in the struct is decoded with its own primitive OI
                countFieldOI = (LongObjectInspector) countField.getFieldObjectInspector();
                sumFieldOI = (LongObjectInspector) sumField.getFieldObjectInspector();
            }

            // init output
            // mode == Mode.PARTIAL1 || mode == Mode.PARTIAL2
            // output: partial aggregation, method: terminatePartial()
            if (mode == Mode.PARTIAL1 || mode == Mode.PARTIAL2) {
                partialAggregationResult = new Object[2];
                partialAggregationResult[0] = new LongWritable(0);
                partialAggregationResult[1] = new LongWritable(0);
                /*
                 * Build the struct OI that describes the partial-aggregation result array.
                 * It is constructed from a list of field names and a list of field OIs.
                 */
                ArrayList<String> fname = new ArrayList<String>();
                fname.add("count");
                fname.add("sum");
                ArrayList<ObjectInspector> foi = new ArrayList<ObjectInspector>();
                // Note: these two OIs describe the two elements of partialAggregationResult[], so the types must match
                foi.add(PrimitiveObjectInspectorFactory.writableLongObjectInspector);
                foi.add(PrimitiveObjectInspectorFactory.writableLongObjectInspector);
                return ObjectInspectorFactory.getStandardStructObjectInspector(fname, foi);
            }
            // mode == Mode.COMPLETE || mode == Mode.FINAL
            // output: full aggregation, method: terminate()
            else {
                // In FINAL / COMPLETE the result is a single value, typed by a primitive OI
                fullAggregationResult = new LongWritable(0);
                return PrimitiveObjectInspectorFactory.writableLongObjectInspector;
            }
        }

        /*
         * The aggregation buffer: caches the running state
         * (count = sum of weights, sum = weighted sum of values)
         */
        static class AverageAgg implements AggregationBuffer {
            long count;
            long sum;
        }

        @Override
        public AggregationBuffer getNewAggregationBuffer() throws HiveException {
            AverageAgg result = new AverageAgg();
            reset(result);
            return result;
        }

        @Override
        public void reset(AggregationBuffer agg) throws HiveException {
            AverageAgg myagg = (AverageAgg) agg;
            myagg.count = 0;
            myagg.sum = 0;
        }

        /*
         * Iterate over the original data: fold one input row (Object[] parameters)
         * into the aggregation buffer.
         * input: original
         */
        @Override
        public void iterate(AggregationBuffer agg, Object[] parameters)
                throws HiveException {
            Object p1 = parameters[0];
            Object p2 = parameters[1];
            if (p1 != null && p2 != null) {
                AverageAgg myagg = (AverageAgg) agg;
                try {
                    long value = PrimitiveObjectInspectorUtils.getLong(p1, avgOriginalInputOI);
                    long weight = PrimitiveObjectInspectorUtils.getLong(p2, weightOriginalInputOI);
                    myagg.count += weight;
                    myagg.sum += value * weight;
                } catch (NumberFormatException e) {
                    throw new HiveException("NumberFormatException: get value failed");
                }
            }
        }

        /*
         * Emit the partial aggregation result.
         * output: partial aggregation
         */
        @Override
        public Object terminatePartial(AggregationBuffer agg) throws HiveException {
            AverageAgg myagg = (AverageAgg) agg;
            ((LongWritable) partialAggregationResult[0]).set(myagg.count);
            ((LongWritable) partialAggregationResult[1]).set(myagg.sum);
            return partialAggregationResult;
        }

        /*
         * Merge a partial aggregation result into the buffer.
         * (Note: an Object[] is itself an Object; here partial is the Object[]
         * produced by terminatePartial().)
         * input: partial aggregation
         */
        @Override
        public void merge(AggregationBuffer agg, Object partial)
                throws HiveException {
            if (partial != null) {
                AverageAgg myagg = (AverageAgg) agg;
                // Use the struct OI to pull the field values out of the partial result
                Object partialCount = soi.getStructFieldData(partial, countField);
                Object partialSum = soi.getStructFieldData(partial, sumField);
                // Decode each field value with its primitive OI
                myagg.count += countFieldOI.get(partialCount);
                myagg.sum += sumFieldOI.get(partialSum);
            }
        }

        /*
         * Emit the final aggregation result.
         * Note: sum / count is long division, so the result is truncated.
         * output: full aggregation
         */
        @Override
        public Object terminate(AggregationBuffer agg) throws HiveException {
            AverageAgg myagg = (AverageAgg) agg;
            if (myagg.count == 0) {
                return null;
            } else {
                fullAggregationResult.set(myagg.sum / myagg.count);
                return fullAggregationResult;
            }
        }
    }

}
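
Once compiled and packaged into a jar, the function can be registered and invoked like any built-in aggregate. A minimal sketch, assuming a hypothetical jar path and function name (the class above has no package declaration, so the bare class name is used):

ADD JAR /path/to/weighted-average.jar;
CREATE TEMPORARY FUNCTION weighted_avg AS 'WeightedAverage';

SELECT weighted_avg(a, c) FROM t;

With the three sample rows from section 4.1, this returns 6.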

5. Ref

  1. 《Hive 编程指南》 (Programming Hive)
  2. Hive之自定义聚合函数UDAF (a blog post on Hive user-defined aggregate functions)