  • From the column Spark学习技巧

    Real-Time Data Warehouse | Exploring Real-Time Data Warehouse Construction with Flink 1.11 SQL

    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_province', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_category1', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_category2', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_category3', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.order_detail', 'properties.bootstrap.servers' = 'kms…

    2.1K · 30 · Published on 2020-09-08
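The preview above cuts each Kafka source definition off mid-option. For context, a complete Flink SQL source table in the 1.11 connector style might look like the following sketch — the column list, broker addresses, group id, and `canal-json` format are assumptions for illustration, not taken from the article:

```sql
-- Sketch only: columns and hosts are hypothetical placeholders
CREATE TABLE ods_base_province (
  id        INT,
  name      STRING,
  region_id INT
) WITH (
  'connector' = 'kafka',
  'topic' = 'mydw.base_province',
  'properties.bootstrap.servers' = 'broker1:9092,broker2:9092',
  'properties.group.id' = 'ods-consumer',
  'format' = 'canal-json',            -- assumption: CDC records produced by Canal
  'scan.startup.mode' = 'earliest-offset'
);
```

Each of the truncated clauses in the preview would follow this same pattern, varying only the topic and schema.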
  • From the column Spark学习技巧

    Real-Time Incremental Data Synchronization with Canal and Flink (Part 1)

    # topic name
    canal.mq.topic=test
    Edit conf/canal.properties as follows:
    # ZooKeeper address
    canal.zkServers = kms-2:2181,kms…
    # options: tcp (default), kafka, RocketMQ
    canal.serverMode = kafka
    # Kafka address
    canal.mq.servers = kms-2:9092,kms…
    Start a Kafka console consumer to test:
    bin/kafka-console-consumer.sh --bootstrap-server kms-2:9092,kms-3:9092,kms-4:9092 --…

    2.8K · 20 · Published on 2020-09-08
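The preview's Canal settings are cut off at the host lists. Assembled into one place, a minimal `conf/canal.properties` sketch for Kafka mode might read as follows — host names here are placeholders, since the article's full addresses are truncated in the preview:

```properties
# conf/canal.properties — sketch; hosts are hypothetical
canal.zkServers = zk1:2181,zk2:2181,zk3:2181
# options: tcp (default), kafka, RocketMQ
canal.serverMode = kafka
canal.mq.servers = broker1:9092,broker2:9092
```

With `canal.mq.topic=test` set in the instance config, the console-consumer command from the preview can then be pointed at the `test` topic to verify that change records arrive.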
  • From the column 五分钟学大数据

    A 50,000-Word, Step-by-Step Guide to Data Warehouse Construction, Covering Offline and Real-Time (Theory + Practice), Part 2

    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_province', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_category1', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_category2', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.base_category3', 'properties.bootstrap.servers' = 'kms…
    ) WITH ( 'connector' = 'kafka', 'topic' = 'mydw.order_detail', 'properties.bootstrap.servers' = 'kms…

    3.3K · 65 · Edited on 2022-04-07
  • From the column Spark学习技巧

    Real-Time Incremental Data Synchronization with Canal and Flink (Part 2)

    Properties props = new Properties();
    props.setProperty("bootstrap.servers", "kms-2:9092,kms…
    // only required for Kafka 0.8
    props.setProperty("zookeeper.connect", "kms-2:2181,kms…

    2.2K · 20 · Published on 2020-09-08
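The Part 2 preview shows the `Properties` object that Flink's Kafka consumer is handed. A self-contained sketch of just the properties construction — host names are placeholders for the truncated addresses, and the `group.id` line is an assumption (a consumer normally sets one):

```java
import java.util.Properties;

public class Main {
    // Build consumer properties in the shape shown by the article's snippet.
    // Broker/ZooKeeper addresses are hypothetical placeholders.
    static Properties consumerProps() {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "broker1:9092,broker2:9092");
        // only required for Kafka 0.8
        props.setProperty("zookeeper.connect", "zk1:2181,zk2:2181");
        // assumption: a consumer group id is typically configured as well
        props.setProperty("group.id", "canal-sync");
        return props;
    }

    public static void main(String[] args) {
        // Print the broker list to confirm the properties were populated.
        System.out.println(consumerProps().getProperty("bootstrap.servers"));
    }
}
```

These properties would then be passed to a `FlinkKafkaConsumer` (or the newer `KafkaSource` builder in later Flink versions) when constructing the stream source.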
  • From the column 大数据解决方案

    Building a Unified Stream-Batch Data Warehouse with Flink on Hive

    -- offset
    'properties.group.id' = 'group1', -- consumer group
    'properties.bootstrap.servers' = 'kms-2:9092,kms…

    5.1K · 42 · Published on 2021-02-04
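The preview's `-- offset` comment refers to the Kafka source's startup position. As a hedged sketch of the option block it likely belongs to (broker address and mode choice are assumptions):

```sql
-- Fragment of a Kafka table's WITH clause; values are illustrative
'properties.group.id' = 'group1',               -- consumer group
'properties.bootstrap.servers' = 'broker1:9092',
'scan.startup.mode' = 'latest-offset'           -- or: earliest-offset, group-offsets, timestamp
```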
  • From the column Spark学习技巧

    Project Practice | A User Behavior Log Analysis System Based on Flink

    # sink
    a1.sinks.sink1.type = org.apache.flume.sink.kafka.KafkaSink
    a1.sinks.sink1.brokerList = kms-2:9092,kms…
    props = new Properties();
    // Kafka broker address
    props.put("bootstrap.servers", "kms-2:9092,kms…

    2.6K · 31 · Published on 2020-09-08
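The preview pairs a Flume Kafka sink with a downstream Kafka consumer. A minimal sketch of the Flume side, using the older `brokerList`/`topic` property names the snippet shows — topic name, channel name, and hosts are assumptions:

```properties
# Flume agent sketch for the KafkaSink shown above; names and hosts are hypothetical
a1.sinks.sink1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.sink1.brokerList = broker1:9092,broker2:9092
a1.sinks.sink1.topic = user-behavior-log
a1.sinks.sink1.channel = c1
```

A Flink job would then consume the same topic via consumer properties like those in the preview's `props.put("bootstrap.servers", …)` line.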