elastic-job script mode does not run properly on macOS

2019-06-17  捞月亮的阿汤哥

Problem description

Today, while working through the script-mode demo from the official elastic-job documentation, I ran into a problem: I wrote the script as instructed, and in theory the console should print "sharding execution context is xxx", but nothing showed up. The code is as follows:

package com.example.script;

import com.dangdang.ddframe.job.config.JobCoreConfiguration;
import com.dangdang.ddframe.job.config.script.ScriptJobConfiguration;
import com.dangdang.ddframe.job.lite.api.JobScheduler;
import com.dangdang.ddframe.job.lite.config.LiteJobConfiguration;
import com.dangdang.ddframe.job.reg.base.CoordinatorRegistryCenter;
import com.dangdang.ddframe.job.reg.zookeeper.ZookeeperConfiguration;
import com.dangdang.ddframe.job.reg.zookeeper.ZookeeperRegistryCenter;

/**
 * Note: the jobName passed to newBuilder must match the jobClass given in the job configuration, otherwise an error is thrown.
 * Dynamic sharding is supported: with a single instance A running, A executes both shard 0 and shard 1.
 * When instance B is started, B takes over shard 1 and A keeps only shard 0.
 * When instance B is shut down, both shards are assigned back to A.
 * <p>
 * overwrite matters: unless it is set to true, the settings stored from the previous run are not overwritten.
 */
public class JobDemo {
    public static void main(String[] args) {
        new JobScheduler(createRegistryCenter(), createJobConfiguration()).init();
    }

    private static CoordinatorRegistryCenter createRegistryCenter() {
        CoordinatorRegistryCenter regCenter = new ZookeeperRegistryCenter(new ZookeeperConfiguration("localhost:2181", "elastic-job-demo"));
        regCenter.init();
        return regCenter;
    }

    private static LiteJobConfiguration createJobConfiguration() {
        // Create the job configuration
        // ...
        // Define the core job configuration: job name, cron expression, total sharding count
        JobCoreConfiguration scriptCoreConfig = JobCoreConfiguration.newBuilder("myScriptElasticJob", "0/5 * * * * ?", 2).build();
        // Define the SCRIPT-type job configuration; the second argument is the command line to execute
        ScriptJobConfiguration scriptJobConfig = new ScriptJobConfiguration(scriptCoreConfig, "/Users/zihao/Desktop/test.sh");
        // Define the Lite job root configuration
        LiteJobConfiguration scriptJobRootConfig = LiteJobConfiguration.newBuilder(scriptJobConfig).overwrite(true).build();

        return scriptJobRootConfig;
    }
}

The corresponding test.sh is as follows:

#!/bin/bash
echo sharding execution context is $*
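
To sanity-check the script outside of elastic-job, you can run it by hand and pass any placeholder string where the framework appends the sharding-context JSON as the last argument (the JSON below is purely illustrative, not what the framework actually sends):

sh /Users/zihao/Desktop/test.sh '{"jobName":"myScriptElasticJob","shardingItem":0}'
# prints: sharding execution context is {"jobName":"myScriptElasticJob","shardingItem":0}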

Solution

Most of the material I found online used Windows cmd scripts. I changed the line

ScriptJobConfiguration scriptJobConfig = new ScriptJobConfiguration(scriptCoreConfig, "/Users/zihao/Desktop/test.sh");

to

ScriptJobConfiguration scriptJobConfig = new ScriptJobConfiguration(scriptCoreConfig, "sh /Users/zihao/Desktop/test.sh");

and the problem was solved; the console now prints:

sharding execution context is {"jobName":"myScriptElasticJob","taskId":"myScriptElasticJob@-@0,1@-@READY@-@192.168.1.129@-@2058","shardingTotalCount":2,"jobParameter":"","shardingItem":0}
sharding execution context is {"jobName":"myScriptElasticJob","taskId":"myScriptElasticJob@-@0,1@-@READY@-@192.168.1.129@-@2058","shardingTotalCount":2,"jobParameter":"","shardingItem":1}
sharding execution context is {"jobName":"myScriptElasticJob","taskId":"myScriptElasticJob@-@0,1@-@READY@-@192.168.1.129@-@2058","shardingTotalCount":2,"jobParameter":"","shardingItem":1}
sharding execution context is {"jobName":"myScriptElasticJob","taskId":"myScriptElasticJob@-@0,1@-@READY@-@192.168.1.129@-@2058","shardingTotalCount":2,"jobParameter":"","shardingItem":0}

The key change is prefixing the script path with sh. Another thing to note is that the script must be referenced by its full (absolute) path.
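
If the root cause is what it looks like, namely that the .sh file is not marked executable and therefore cannot be launched directly, an alternative fix should be to make the script executable and keep the original configuration without the sh prefix. I have not verified this, so treat it as a sketch:

chmod +x /Users/zihao/Desktop/test.sh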
