I get a "not serializable" error when writing data through a Flink stream, following the documentation. I am using Flink 1.6, Elasticsearch 6.4, and flink-connector-elasticsearch6.

at org.apache.flink.util.Preconditions.checkArgument(Preconditions.java:139)
at org.apache.flink.streaming.connectors.elasticsearch.ElasticsearchSink
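The checkArgument frame in Preconditions is typically Flink's up-front check that the ElasticsearchSinkFunction, and everything it captures, is java.io.Serializable. A minimal stdlib-only sketch of that failure mode, using hypothetical stand-in classes (these are not Flink or Elasticsearch APIs):

```java
import java.io.ByteArrayOutputStream;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializabilityDemo {
    // Stand-in for a non-serializable object such as an HTTP client.
    static class NonSerializableClient { }

    // BAD: captures the client as a regular field, so serialization fails.
    static class BadSinkFunction implements Serializable {
        final NonSerializableClient client = new NonSerializableClient();
    }

    // GOOD: the client is transient; it would be re-created on the worker
    // instead of being shipped with the function.
    static class GoodSinkFunction implements Serializable {
        transient NonSerializableClient client;
    }

    // Mimics the kind of pre-flight check Flink performs before shipping
    // user functions to the cluster.
    static boolean isSerializable(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(isSerializable(new BadSinkFunction()));  // false
        System.out.println(isSerializable(new GoodSinkFunction())); // true
    }
}
```

The usual fix is to avoid capturing clients, loggers, or other heavy objects inside the sink function; mark such fields transient and initialize them lazily in task-side lifecycle code instead.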
I am trying to submit a job to a Ververica Platform deployed in Kubernetes, but I get the message below. I submitted the same code to a Flink standalone cluster and it works fine! I am using Flink 1.10.1 and the code is Scala 2.12.

The following properties are requested:
connector.password=****** connect
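In Flink 1.10, a "The following properties are requested: connector.password=..." dump usually comes from a NoMatchingTableFactoryException: no TableFactory on the cluster's classpath matched the requested `connector.*` properties. That fits the symptom here, since the connector jar may sit in the standalone cluster's `lib/` but be missing from the Ververica Platform image. For reference, a sketch of the legacy Flink 1.10 DDL shape that produces such `connector.*` keys (the connector type, URL, and columns below are assumptions, not taken from the failing job):

```sql
CREATE TABLE sink_table (
  id   BIGINT,
  name STRING
) WITH (
  'connector.type'     = 'jdbc',  -- assumed connector; the failing job may use another
  'connector.url'      = 'jdbc:mysql://db-host:3306/mydb',
  'connector.table'    = 'sink_table',
  'connector.username' = 'flink_user',
  'connector.password' = '******'
);
```

Whichever connector is actually used, the matching connector jar, built for the same Flink 1.10.1 / Scala 2.12 combination, has to be on the classpath of the platform image as well, not only in the standalone cluster.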
I want to create a Hive table using the Flink SQL client. I can create table t2 successfully, but when I query t2 it fails with: Table options do not contain an option key 'connector' for discovering a connector

Flink SQL> use testdb1;
[INFO] Table has been created.
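The "Table options do not contain an option key 'connector'" error means the table's stored options have no `connector` entry, so Flink cannot choose a source to read it with; a `CREATE TABLE` with no `WITH` clause, issued outside the Hive dialect, registers metadata only. If t2 is meant to be a Flink-managed table rather than a native Hive table, one way is to name a connector explicitly (the connector choice, path, and format below are illustrative assumptions):

```sql
CREATE TABLE t2 (
  id   INT,
  name STRING
) WITH (
  'connector' = 'filesystem',  -- assumed; any connector packaged with the client works here
  'path'      = 'file:///tmp/t2',
  'format'    = 'csv'
);
```

If t2 is instead supposed to be a genuine Hive table, it should be created and queried through a registered HiveCatalog with the Hive dialect enabled (`SET table.sql-dialect=hive;`), in which case no `connector` option is involved at all.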
/blob/83a6400e2587b067d08a64bc7e10edd4b57e71b4/flink-connectors/flink-connector-filesystem/src/main/java
(HadoopFileSystem.java:215)
at org.apache.flink.connector.file.sink.FileSink$RowFormatBuilder.createBucketWriter
of type org.apache.kafka.clients.consumer.OffsetResetStrategy in instance of org.apache.flink.connector.kafka.source.enumerator.initializer.ReaderHandledOffsetsInitializer

:flink-connector-kafka:${flinkVersion}"
// https://mvnrepositor
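An error of the shape "cannot assign instance ... of type OffsetResetStrategy in instance of ReaderHandledOffsetsInitializer" is characteristic of Java deserialization across mismatched classpaths, most often a flink-connector-kafka version in the job jar that differs from what the cluster runs. A sketch of pinning it in Gradle (the version value is an assumption and must equal the cluster's Flink version):

```groovy
ext {
    // Assumption: set this to the exact Flink version the cluster runs.
    flinkVersion = '1.15.4'
}

dependencies {
    // https://mvnrepository.com/artifact/org.apache.flink/flink-connector-kafka
    implementation "org.apache.flink:flink-connector-kafka:${flinkVersion}"
}
```

It also helps to declare the core Flink dependencies (flink-streaming-java, flink-clients, etc.) as compileOnly/provided, so that the job jar never shades its own copies and the cluster's classes are the only ones deserializing checkpoint and enumerator state.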