Error when running the DataX self-check script after installing DataX on CentOS 7


[root@db9 opt]# python /opt/datax/bin/datax.py /opt/datax/job/job.json

DataX (DATAX-OPENSOURCE-3.0), From Alibaba !
Copyright (C) 2010-2017, Alibaba Group. All Rights Reserved.

2022-10-16 22:00:10.932 [main] INFO MessageSource - JVM TimeZone: GMT+08:00, Locale: zh_CN
2022-10-16 22:00:10.935 [main] INFO MessageSource - use Locale: zh_CN timeZone: sun.util.calendar.ZoneInfo[id="GMT+08:00",offset=28800000,dstSavings=0,useDaylight=false,transitions=0,lastRule=null]
2022-10-16 22:00:10.953 [main] INFO VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2022-10-16 22:00:10.960 [main] INFO Engine - the machine info =>

osInfo: Oracle Corporation 1.8 25.121-b13
jvmInfo: Linux amd64 3.10.0-1160.el7.x86_64
cpu num: 4

totalPhysicalMemory: -0.00G
freePhysicalMemory: -0.00G
maxFileDescriptorCount: -1
currentOpenFileDescriptorCount: -1

GC Names [PS MarkSweep, PS Scavenge]

MEMORY_NAME            | allocation_size | init_size
PS Eden Space          | 256.00MB        | 256.00MB
Code Cache             | 240.00MB        | 2.44MB
Compressed Class Space | 1,024.00MB      | 0.00MB
PS Survivor Space      | 42.50MB         | 42.50MB
PS Old Gen             | 683.00MB        | 683.00MB
Metaspace              | -0.00MB         | 0.00MB

2022-10-16 22:00:10.980 [main] INFO Engine -
{
    "content": [
        {
            "reader": {
                "name": "streamreader",
                "parameter": {
                    "column": [
                        { "type": "string", "value": "DataX" },
                        { "type": "long", "value": 19890604 },
                        { "type": "date", "value": "1989-06-04 00:00:00" },
                        { "type": "bool", "value": true },
                        { "type": "bytes", "value": "test" }
                    ],
                    "sliceRecordCount": 100000
                }
            },
            "writer": {
                "name": "streamwriter",
                "parameter": { "encoding": "UTF-8", "print": false }
            }
        }
    ],
    "setting": {
        "errorLimit": { "percentage": 0.02, "record": 0 },
        "speed": { "byte": 10485760 }
    }
}
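This is the stock stream-to-stream self-check job: streamreader generates synthetic records and streamwriter effectively discards them ("print" is false). As a rough cross-check of these settings against the totals reported further down, here is a minimal sketch in Python (the variable names are mine; the values are copied from the config above and from the "splits to [8] tasks" entries in the log below):

slice_record_count = 100000   # streamreader "sliceRecordCount" from the config above
task_count = 8                # "splits to [8] tasks" later in the log
byte_limit = 10485760         # "setting.speed.byte" from the config above

print(slice_record_count * task_count)   # 800000, matches the "Total 800000 records" summary
print(byte_limit / float(1 << 20))       # 10.0, i.e. a 10 MiB/s job-level byte-rate limit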

2022-10-16 22:00:11.003 [main] WARN Engine - prioriy set to 0, because NumberFormatException, the value is: null
2022-10-16 22:00:11.005 [main] INFO PerfTrace - PerfTrace traceId=job_-1, isEnable=false, priority=0
2022-10-16 22:00:11.006 [main] INFO JobContainer - DataX jobContainer starts job.
2022-10-16 22:00:11.008 [main] INFO JobContainer - Set jobId = 0
2022-10-16 22:00:11.024 [job-0] INFO JobContainer - jobContainer starts to do prepare ...
2022-10-16 22:00:11.025 [job-0] INFO JobContainer - DataX Reader.Job [streamreader] do prepare work .
2022-10-16 22:00:11.026 [job-0] INFO JobContainer - DataX Writer.Job [streamwriter] do prepare work .
2022-10-16 22:00:11.026 [job-0] INFO JobContainer - jobContainer starts to do split ...
2022-10-16 22:00:11.026 [job-0] INFO JobContainer - Job set Max-Byte-Speed to 10485760 bytes.
2022-10-16 22:00:11.027 [job-0] INFO JobContainer - DataX Reader.Job [streamreader] splits to [8] tasks.
2022-10-16 22:00:11.027 [job-0] INFO JobContainer - DataX Writer.Job [streamwriter] splits to [8] tasks.
2022-10-16 22:00:11.056 [job-0] INFO JobContainer - jobContainer starts to do schedule ...
2022-10-16 22:00:11.069 [job-0] INFO JobContainer - Scheduler starts [2] taskGroups.
2022-10-16 22:00:11.072 [job-0] INFO JobContainer - Running by standalone Mode.
2022-10-16 22:00:11.088 [taskGroup-1] INFO TaskGroupContainer - taskGroupId=[1] start [4] channels for [4] tasks.
2022-10-16 22:00:11.088 [taskGroup-0] INFO TaskGroupContainer - taskGroupId=[0] start [4] channels for [4] tasks.
2022-10-16 22:00:11.093 [taskGroup-0] INFO Channel - Channel set byte_speed_limit to 1231412.
2022-10-16 22:00:11.094 [taskGroup-0] INFO Channel - Channel set record_speed_limit to -1, No tps activated.
2022-10-16 22:00:11.113 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[2] attemptCount[1] is started
2022-10-16 22:00:11.117 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[1] attemptCount[1] is started
2022-10-16 22:00:11.125 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[6] attemptCount[1] is started
2022-10-16 22:00:11.136 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[4] attemptCount[1] is started
2022-10-16 22:00:11.145 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[7] attemptCount[1] is started
2022-10-16 22:00:11.157 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[0] attemptCount[1] is started
2022-10-16 22:00:11.165 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[3] attemptCount[1] is started
2022-10-16 22:00:11.193 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[5] attemptCount[1] is started
2022-10-16 22:00:11.995 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[0] is successed, used[850]ms
2022-10-16 22:00:12.296 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[6] is successed, used[1186]ms
2022-10-16 22:00:12.449 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[4] is successed, used[1323]ms
2022-10-16 22:00:12.449 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[7] is successed, used[1311]ms
2022-10-16 22:00:13.051 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[2] is successed, used[1941]ms
2022-10-16 22:00:13.098 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[5] is successed, used[1907]ms
2022-10-16 22:00:13.252 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] taskId[1] is successed, used[2137]ms
2022-10-16 22:00:13.253 [taskGroup-1] INFO TaskGroupContainer - taskGroup[1] completed it's tasks.
2022-10-16 22:00:13.299 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[3] is successed, used[2139]ms
2022-10-16 22:00:13.300 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] completed it's tasks.
2022-10-16 22:00:21.096 [job-0] INFO StandAloneJobContainerCommunicator - Total 800000 records, 20800000 bytes | Speed 1.98MB/s, 80000 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.367s | All Task WaitReaderTime 11.500s | Percentage 100.00%
2022-10-16 22:00:21.096 [job-0] INFO AbstractScheduler - Scheduler accomplished all tasks.
2022-10-16 22:00:21.096 [job-0] INFO JobContainer - DataX Writer.Job [streamwriter] do post work.
2022-10-16 22:00:21.097 [job-0] INFO JobContainer - DataX Reader.Job [streamreader] do post work.
2022-10-16 22:00:21.097 [job-0] INFO JobContainer - DataX jobId [0] completed successfully.
2022-10-16 22:00:21.098 [job-0] INFO HookInvoker - No hook invoked, because base dir not exists or is a file: /opt/datax/hook
2022-10-16 22:00:21.100 [job-0] INFO JobContainer -
[total cpu info] =>
averageCpu | maxDeltaCpu | minDeltaCpu
-1.00%     | -1.00%      | -1.00%

[total gc info] =>
NAME         | totalGCCount | maxDeltaGCCount | minDeltaGCCount | totalGCTime | maxDeltaGCTime | minDeltaGCTime
PS MarkSweep | 0            | 0               | 0               | 0.000s      | 0.000s         | 0.000s
PS Scavenge  | 0            | 0               | 0               | 0.000s      | 0.000s         | 0.000s

2022-10-16 22:00:21.100 [job-0] INFO JobContainer - PerfTrace not enable!
2022-10-16 22:00:21.101 [job-0] INFO StandAloneJobContainerCommunicator - Total 800000 records, 20800000 bytes | Speed 1.98MB/s, 80000 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.367s | All Task WaitReaderTime 11.500s | Percentage 100.00%
2022-10-16 22:00:21.102 [job-0] INFO JobContainer -
Job start time           : 2022-10-16 22:00:11
Job end time             : 2022-10-16 22:00:21
Total job duration       : 10s
Average job throughput   : 1.98MB/s
Record write speed       : 80000rec/s
Total records read       : 800000
Total read/write failures: 0

[root@db9 opt]#
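The closing summary is internally consistent: over the 10 seconds between the start and end timestamps, 800000 records and 20800000 bytes work out to the reported rates. A minimal arithmetic check in Python (values copied verbatim from the log above):

total_records = 800000   # total records read, from the summary above
total_bytes = 20800000   # from the StandAloneJobContainerCommunicator line
elapsed = 10.0           # 22:00:11 to 22:00:21

print(total_records / elapsed)            # 80000.0, matches "80000rec/s"
print(total_bytes / elapsed / (1 << 20))  # ~1.98, matches the "1.98MB/s" average throughput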


