Solr logs not being pushed directly to kafka, Solr cannot connect to ZK

I am trying to send logs from solr directly to kafka using log4j. While the logs will be printed to stdout, no data arrives in kafka. I am able to push data to kafka with the command line producer.

The warning and error I am getting:

WARN  - 2015-01-19 12:09:25.545; org.apache.solr.cloud.Overseer$ClusterStateUpdater; Solr cannot talk to ZK, exiting Overseer main queue loop                                                                                                                             
INFO  - 2015-01-19 12:09:25.552; org.apache.solr.cloud.Overseer$ClusterStateUpdater; Overseer Loop exiting : 10.254.120.50:8900_solr 
WARN  - 2015-01-19 12:09:25.554; org.apache.solr.common.cloud.ZkStateReader$2; ZooKeeper watch triggered, but Solr cannot talk to ZK 
ERROR - 2015-01-19 12:09:25.560; org.apache.solr.cloud.Overseer$ClusterStateUpdater; could not read the data                         
org.apache.zookeeper.KeeperException$SessionExpiredException: KeeperErrorCode = Session expired for /overseer_elect/leader     

My log4j.properties file:

    solr.log=/home/solradmin/solr/latest/logs/
    log4j.rootLogger=INFO, file, KAFKA
    log4j.logger.KAFKA=INFO, file
    log4j.logger.solr=INFO, KAFKA

    log4j.appender.stdout=org.apache.log4j.ConsoleAppender
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
    log4j.appender.stdout.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n

    log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
    log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
    log4j.appender.KAFKA.layout.ConversionPattern=%-5p: %c - %m%n
    log4j.appender.KAFKA.BrokerList=localhost:9092
    log4j.appender.KAFKA.Topic=herpderp

    log4j.appender.file=org.apache.log4j.RollingFileAppender
    log4j.appender.file.MaxFileSize=100MB
    log4j.appender.file.MaxBackupIndex=9
    log4j.appender.file.File=${solr.log}/solr.log
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%-5p - %d{yyyy-MM-dd HH:mm:ss.SSS}; %C; %m\n
    log4j.logger.org.apache.solr=DEBUG
    log4j.logger.org.apache.zookeeper=WARN
    log4j.logger.org.apache.hadoop=WARN

The log4j documentation does not list kafka as a supported appender. Yet the kafka documentation shows that log4j is easy to configure.

Does log4j require some sort of plugin to support kafka?
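For what it's worth: the Kafka appender is not part of log4j itself; it ships with Kafka. In the 0.8.x era, `kafka.producer.KafkaLog4jAppender` lives in the main Kafka jar, while later Kafka releases moved it into a separate `kafka-log4j-appender` artifact under a new package name. Either way, that jar (plus the Kafka client jars it depends on) must be on Solr's classpath, or log4j will fail to instantiate the appender while the other appenders keep working. A sketch of the newer-style configuration, assuming the `kafka-log4j-appender` artifact is used (class and property names differ from the 0.8.x appender shown above):

```properties
# Sketch only: assumes kafka-log4j-appender (Kafka 0.9+) and its
# dependencies are on the classpath. In Kafka 0.8.x the class was
# kafka.producer.KafkaLog4jAppender instead.
log4j.appender.KAFKA=org.apache.kafka.log4jappender.KafkaLog4jAppender
log4j.appender.KAFKA.brokerList=localhost:9092
log4j.appender.KAFKA.topic=herpderp
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%-5p: %c - %m%n
```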

I tried different configurations using the following sources: http://kafka.apache.org/07/quickstart.html and "KafkLog4JAppender not pushing application logs to kafka topic".
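One possible link between the two symptoms, offered only as a guess: if the Kafka appender cannot reach a broker, log calls can block, and long stalls inside Solr can in turn let the ZooKeeper session expire, producing exactly the `SessionExpiredException` shown above. log4j's `AsyncAppender` decouples logging from the calling thread, but note that it cannot be configured from log4j.properties; it takes `appender-ref` elements, which only the XML configurator (log4j.xml / `DOMConfigurator`) supports. A minimal sketch, assuming the Kafka appender jar is available:

```xml
<!-- Sketch only: fragment of a log4j.xml. AsyncAppender requires XML
     configuration because appender-ref cannot be expressed in a
     properties file. -->
<appender name="KAFKA" class="kafka.producer.KafkaLog4jAppender">
  <param name="BrokerList" value="localhost:9092"/>
  <param name="Topic" value="herpderp"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%-5p: %c - %m%n"/>
  </layout>
</appender>

<appender name="ASYNC_KAFKA" class="org.apache.log4j.AsyncAppender">
  <appender-ref ref="KAFKA"/>
</appender>
```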


Source: https://stackoverflow.com/questions/31572380
Updated: 2024-02-27 10:02

Accepted answer

I managed to run it by including the needed libraries. I could have changed the pom and built the final jar with the dependencies bundled, but I preferred not to change the project.

After building it with mvn clean install -DskipTests=true -Dmaven.javadoc.skip=true

I ran it with the java classpath set:

java -cp tez-dist/target/tez-0.7.0/lib/*:tez-dist/target/tez-0.7.0/* org.apache.tez.examples.OrderedWordCount in.txt out
