The full error output is as follows:

##########################################################################################################################################################

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/table/factories/StreamTableSourceFactory
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:348)
    at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
    at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
    at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
    at java.util.Iterator.forEachRemaining(Iterator.java:116)
    at org.apache.flink.table.factories.TableFactoryService.discoverFactories(TableFactoryService.java:214)
    at org.apache.flink.table.factories.TableFactoryService.findAllInternal(TableFactoryService.java:170)
    at org.apache.flink.table.factories.TableFactoryService.findAll(TableFactoryService.java:125)
    at org.apache.flink.table.factories.ComponentFactoryService.find(ComponentFactoryService.java:48)
    at org.apache.flink.table.api.bridge.scala.internal.StreamTableEnvironmentImpl$.lookupExecutor(StreamTableEnvironmentImpl.scala:315)
    at org.apache.flink.table.api.bridge.scala.internal.StreamTableEnvironmentImpl$.create(StreamTableEnvironmentImpl.scala:285)
    at org.apache.flink.table.api.bridge.scala.StreamTableEnvironment$.create(StreamTableEnvironment.scala:476)
    at FlinkKafkaDDLDemo$.main(FlinkKafkaDDLDemo.scala:27)
    at FlinkKafkaDDLDemo.main(FlinkKafkaDDLDemo.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.table.factories.StreamTableSourceFactory
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 39 more
Add the following dependency:
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner_2.12</artifactId>
    <version>1.12.0</version>
    <scope>provided</scope>
</dependency>

##########################################################################################################################################################

This produced:

Exception in thread "main" org.apache.flink.table.api.TableException: Could not instantiate the executor. Make sure a planner module is on the classpath
    at org.apache.flink.table.api.bridge.scala.internal.StreamTableEnvironmentImpl$.lookupExecutor(StreamTableEnvironmentImpl.scala:331)
    at org.apache.flink.table.api.bridge.scala.internal.StreamTableEnvironmentImpl$.create(StreamTableEnvironmentImpl.scala:285)
    at org.apache.flink.table.api.bridge.scala.StreamTableEnvironment$.create(StreamTableEnvironment.scala:476)
    at FlinkKafkaDDLDemo$.main(FlinkKafkaDDLDemo.scala:26)
    at FlinkKafkaDDLDemo.main(FlinkKafkaDDLDemo.scala)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.delegation.ExecutorFactory' in
the classpath.

Reason: No factory supports the additional filters.

The following properties are requested:
class-name=org.apache.flink.table.planner.delegation.BlinkExecutorFactory
streaming-mode=true

The following factories have been considered:
org.apache.flink.table.executor.StreamExecutorFactory
    at org.apache.flink.table.factories.ComponentFactoryService.find(ComponentFactoryService.java:71)
    at org.apache.flink.table.api.bridge.scala.internal.StreamTableEnvironmentImpl$.lookupExecutor(StreamTableEnvironmentImpl.scala:315)
    ... 4 more

Add the following dependency:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.12</artifactId>
    <version>1.12.0</version>
</dependency>
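
For context, the error above shows the environment asking for the BlinkExecutorFactory with streaming-mode=true, which is provided by the flink-table-planner-blink module. In the Flink 1.12 Scala bridge the table environment is created roughly as in the minimal sketch below; the variable names are assumptions, and the actual code is in the linked FlinkKafkaDDLDemo.scala.

import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.table.api.EnvironmentSettings
import org.apache.flink.table.api.bridge.scala.StreamTableEnvironment

// Minimal sketch: selecting the blink planner that the error above was looking for
// ("class-name=...BlinkExecutorFactory"); requires flink-table-planner-blink_2.12 on the classpath.
val env = StreamExecutionEnvironment.getExecutionEnvironment
val settings = EnvironmentSettings
  .newInstance()
  .useBlinkPlanner()   // provided by flink-table-planner-blink_2.12
  .inStreamingMode()   // matches "streaming-mode=true" in the error
  .build()
val tEnv = StreamTableEnvironment.create(env, settings)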

##########################################################################################################################################################

Continuing to debug, the next problem:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
    at org.apache.flink.table.planner.delegation.PlannerBase.<init>(PlannerBase.scala:118)
    at org.apache.flink.table.planner.delegation.StreamPlanner.<init>(StreamPlanner.scala:47)
    at org.apache.flink.table.planner.delegation.BlinkPlannerFactory.create(BlinkPlannerFactory.java:50)
    at org.apache.flink.table.api.bridge.scala.internal.StreamTableEnvironmentImpl$.create(StreamTableEnvironmentImpl.scala:294)
    at org.apache.flink.table.api.bridge.scala.StreamTableEnvironment$.create(StreamTableEnvironment.scala:476)
    at org.apache.flink.table.api.bridge.scala.StreamTableEnvironment$.create(StreamTableEnvironment.scala:448)
    at FlinkKafkaDDLDemo$.main(FlinkKafkaDDLDemo.scala:28)
    at FlinkKafkaDDLDemo.main(FlinkKafkaDDLDemo.scala)

The NoSuchMethodError on scala.Predef$.refArrayOps indicates a Scala version mismatch: the _2.12 Flink artifacts require Scala 2.12. In Project Structure, change the SDK to Scala 2.12.8.
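
Equivalently, if the project's Scala version is managed through Maven, keeping scala-library on the same 2.12 line as the _2.12 Flink artifacts avoids this mismatch. A minimal sketch, assuming the property names shown here:

<properties>
    <!-- assumed property names; keep scala-library on the same 2.12 line as the Flink _2.12 artifacts -->
    <scala.binary.version>2.12</scala.binary.version>
    <scala.version>2.12.8</scala.version>
</properties>

<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
</dependency>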

##########################################################################################################################################################

Continuing to debug, another problem:

Exception in thread "main" org.apache.flink.table.api.TableException: findAndCreateTableSource failed.
    at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:49)
    at org.apache.flink.table.planner.plan.schema.LegacyCatalogSourceTable.findAndCreateLegacyTableSource(LegacyCatalogSourceTable.scala:193)
    at org.apache.flink.table.planner.plan.schema.LegacyCatalogSourceTable.toRel(LegacyCatalogSourceTable.scala:96)
    at org.apache.calcite.sql2rel.SqlToRelConverter.toRel(SqlToRelConverter.java:3585)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertIdentifier(SqlToRelConverter.java:2507)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2144)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2093)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertFrom(SqlToRelConverter.java:2050)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelectImpl(SqlToRelConverter.java:663)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertSelect(SqlToRelConverter.java:644)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertQueryRecursive(SqlToRelConverter.java:3438)
    at org.apache.calcite.sql2rel.SqlToRelConverter.convertQuery(SqlToRelConverter.java:570)
    at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.org$apache$flink$table$planner$calcite$FlinkPlannerImpl$$rel(FlinkPlannerImpl.scala:165)
    at org.apache.flink.table.planner.calcite.FlinkPlannerImpl.rel(FlinkPlannerImpl.scala:157)
    at org.apache.flink.table.planner.operations.SqlToOperationConverter.toQueryOperation(SqlToOperationConverter.java:823)
    at org.apache.flink.table.planner.operations.SqlToOperationConverter.convertSqlQuery(SqlToOperationConverter.java:795)
    at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:250)
    at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:78)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.sqlQuery(TableEnvironmentImpl.java:639)
    at FlinkKafkaDDLDemo$.main(FlinkKafkaDDLDemo.scala:86)
    at FlinkKafkaDDLDemo.main(FlinkKafkaDDLDemo.scala)
Caused by: org.apache.flink.table.api.NoMatchingTableFactoryException: Could not find a suitable table factory for 'org.apache.flink.table.factories.TableSourceFactory' in
the classpath.

Reason: Required context properties mismatch.

The matching candidates:
org.apache.flink.table.sources.CsvAppendTableSourceFactory
Mismatched properties:
'connector.type' expects 'filesystem', but is 'kafka'
'format.type' expects 'csv', but is 'json'

The following properties are requested:
connector.properties.0.key=zookeeper.connect
connector.properties.0.value=Desktop:2181
connector.properties.1.key=bootstrap.servers
connector.properties.1.value=Desktop:9091
connector.startup-mode=earliest-offset
connector.topic=kafka_ddl
connector.type=kafka
connector.version=2.5.0
format.derive-schema=true
format.type=json
schema.0.data-type=VARCHAR(2147483647)
schema.0.name=name
schema.1.data-type=VARCHAR(2147483647)
schema.1.name=age
schema.2.data-type=VARCHAR(2147483647)
schema.2.name=city
schema.3.data-type=VARCHAR(2147483647)
schema.3.name=address
schema.4.data-type=TIMESTAMP(6)
schema.4.name=ts
update-mode=append

The following factories have been considered:
org.apache.flink.table.sources.CsvBatchTableSourceFactory
org.apache.flink.table.sources.CsvAppendTableSourceFactory
org.apache.flink.streaming.connectors.kafka.KafkaTableSourceSinkFactory
    at org.apache.flink.table.factories.TableFactoryService.filterByContext(TableFactoryService.java:322)
    at org.apache.flink.table.factories.TableFactoryService.filter(TableFactoryService.java:190)
    at org.apache.flink.table.factories.TableFactoryService.findSingleInternal(TableFactoryService.java:143)
    at org.apache.flink.table.factories.TableFactoryService.find(TableFactoryService.java:96)
    at org.apache.flink.table.factories.TableFactoryUtil.findAndCreateTableSource(TableFactoryUtil.java:46)
    ... 20 more

Process finished with exit code 1

Add to the dependencies:

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.12</artifactId>
    <version>1.12.0</version>
</dependency>
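
Putting the fixes above together, the table-related part of the pom ends up roughly as below (versions and scopes as listed above; the project's other dependencies, such as the streaming/Scala bridge artifacts, are assumed and not repeated here):

<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner_2.12</artifactId>
    <version>1.12.0</version>
    <scope>provided</scope>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-blink_2.12</artifactId>
    <version>1.12.0</version>
</dependency>
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka_2.12</artifactId>
    <version>1.12.0</version>
</dependency>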

The current code is here:

https://gitee.com/appleyuchi/Flink_Code/blob/master/flink读kafka/Scala/src/main/scala/FlinkKafkaDDLDemo.scala
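
For reference, the source-table definition implied by the requested properties in the last error is roughly the sketch below. It simply restates the connector.*, format.* and schema properties listed above as a legacy-connector DDL; the authoritative statement is in the linked FlinkKafkaDDLDemo.scala, and the use of tEnv from the earlier sketch is an assumption.

// Sketch reconstructed from the requested properties in the error above.
val createKafkaSource =
  """
    |CREATE TABLE kafka_ddl (
    |  name    STRING,        -- VARCHAR(2147483647) in the requested properties
    |  age     STRING,
    |  city    STRING,
    |  address STRING,
    |  ts      TIMESTAMP(6)
    |) WITH (
    |  'connector.type'               = 'kafka',
    |  'connector.version'            = '2.5.0',
    |  'connector.topic'              = 'kafka_ddl',
    |  'connector.startup-mode'       = 'earliest-offset',
    |  'connector.properties.0.key'   = 'zookeeper.connect',
    |  'connector.properties.0.value' = 'Desktop:2181',
    |  'connector.properties.1.key'   = 'bootstrap.servers',
    |  'connector.properties.1.value' = 'Desktop:9091',
    |  'update-mode'                  = 'append',
    |  'format.type'                  = 'json',
    |  'format.derive-schema'         = 'true'
    |)
    |""".stripMargin

// Register the table, then query it; the failing sqlQuery call in the stack trace
// (FlinkKafkaDDLDemo.scala:86) corresponds to a SELECT against this table.
tEnv.executeSql(createKafkaSource)
val result = tEnv.sqlQuery("SELECT name, age, city, address, ts FROM kafka_ddl")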
