
HDFS put a local file to hdfs but got UnresolvedAddressException

I want to put a 70 GB file into HDFS, so I used the `put` command. However, I get the exception below. The same command works with a small file. Does anyone know what the problem could be? Thanks!

WARN  [DataStreamer for file /user/qzhao/data/sorted/WGC033800D_sorted.bam._COPYING_] hdfs.DFSClient (DFSOutputStream.java:run(628)) - DataStreamer Exception java.nio.channels.UnresolvedAddressException
    at sun.nio.ch.Net.checkAddress(Net.java:127)
    at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:644)
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:192)
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1526)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1328)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1281)
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:526)
put: java.nio.channels.ClosedChannelException
    at org.apache.hadoop.hdfs.DFSOutputStream.checkClosed(DFSOutputStream.java:1538)
    at org.apache.hadoop.fs.FSOutputSummer.write(FSOutputSummer.java:98)
    at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:58)
    at java.io.DataOutputStream.write(DataOutputStream.java:107)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:80)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:52)
    at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:112)
    at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWithDestination.java:395)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.java:327)
    at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:303)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:243)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:228)
    at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:306)
    at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:278)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.java:223)
    at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:260)
    at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:244)
    at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:200)
    at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:259)
    at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
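`java.nio.channels.UnresolvedAddressException` in the `DataStreamer` typically means the client could not resolve the hostname of a DataNode that the NameNode handed back for the write pipeline, so the NIO connect fails before any data flows. A minimal sketch of the underlying JDK behavior (the hostname `datanode-3.invalid` is made up for illustration; `.invalid` is a reserved TLD that never resolves, and 50010 is the classic DataNode transfer port):

```java
import java.net.InetSocketAddress;
import java.nio.channels.SocketChannel;
import java.nio.channels.UnresolvedAddressException;

public class ResolveCheck {
    public static void main(String[] args) throws Exception {
        // A hostname that cannot be looked up leaves the InetSocketAddress
        // in the "unresolved" state -- the same state a DataNode address is
        // in when the HDFS client's DNS/hosts lookup fails.
        InetSocketAddress addr = new InetSocketAddress("datanode-3.invalid", 50010);
        System.out.println("unresolved = " + addr.isUnresolved());

        // Connecting through NIO with an unresolved address throws exactly
        // the exception seen in the DataStreamer log above.
        try (SocketChannel ch = SocketChannel.open()) {
            ch.connect(addr);
        } catch (UnresolvedAddressException e) {
            System.out.println("caught UnresolvedAddressException");
        }
    }
}
```

In practice this usually points at name resolution on the client host: check that every DataNode hostname registered with the NameNode resolves from the machine running `hdfs dfs -put` (via DNS or `/etc/hosts`). In multi-homed or NAT setups, the `dfs.client.use.datanode.hostname` client setting is also commonly involved, since it controls whether the client connects by hostname or by the IP the DataNode reported.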

Source: https://stackoverflow.com/questions/36926453
Updated: 2023-02-14 09:02

Accepted answer

Use @JoinColumn instead of @Column:

@ManyToOne
@JoinColumn(name="LicenseeFK")
private Licensee licensee;
