
ClassNotFoundException in Hadoop



Using Hadoop MapReduce, I am writing code to get substrings of different lengths. For example, given the string "ZYXCBA" and length 3, my code has to return all possible substrings of length 3 ("ZYX", "YXC", "XCB", "CBA"), then length 4 ("ZYXC", "YXCB", "XCBA"), and finally length 5 ("ZYXCB", "YXCBA").

In the map phase I did the following:

key = length of substrings I want

value = "ZYXCBA".

So the mapper output is

3,"ZYXCBA"
4,"ZYXCBA"
5,"ZYXCBA"

In reduce I take the string ("ZYXCBA") and key 3 to get all substrings of length 3. The same happens for 4 and 5. The results are collected in an ArrayList.
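To illustrate what I expect the reducer to produce, here is a minimal plain-Java sketch of that substring enumeration (a standalone illustration, not part of the MapReduce job; the class and variable names are made up):

public class SubstringSketch {
    public static void main(String[] args) {
        String s = "ZYXCBA";
        // For each requested length k, emit every substring of that length.
        for (int k = 3; k <= 5; k++) {
            for (int i = 0; i + k <= s.length(); i++) {
                // substring(i, i + k) yields the k characters starting at position i
                System.out.println(k + "," + s.substring(i, i + k));
            }
        }
    }
}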

I am running my code using the following command:

hduser@Ganesh:~/Documents$ hadoop jar Saishingles.jar hadoopshingles.Saishingles Behara/Shingles/input Behara/Shingles/output

My code is shown below:

package hadoopshingles;

import java.io.IOException;
import java.util.ArrayList;

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class Saishingles {

    public static class shinglesmapper extends Mapper<Object, Text, IntWritable, Text> {

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            String str = new String(value.toString());
            String[] list = str.split(" ");
            int index = Integer.parseInt(list[0]);
            String val = list[1];
            int length = val.length();
            for (int i = index; i <= length; i++) {
                context.write(new IntWritable(index), new Text(val));
            }
        }
    }

    public static class shinglesreducer extends Reducer<IntWritable, Text, IntWritable, ArrayList<String>> {
        private ArrayList<String> result = new ArrayList<String>();

        public void reduce(IntWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String str = new String(value.toString());
            int newkey = key.get();
            int Tz = str.length() - newkey + 1;
            int position = 0;
            while (position <= Tz) {
                result.add(str.substring(position, position + newkey - 1));
                position = position + 1;
            }
            context.write(new IntWritable(newkey), result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "Saishingles");
        job.setJarByClass(hadoopshingles.Saishingles.class);
        job.setMapperClass(shinglesmapper.class);
        job.setCombinerClass(shinglesreducer.class);
        job.setReducerClass(shinglesreducer.class);
        job.setMapOutputKeyClass(IntWritable.class);
        job.setMapOutputValueClass(Text.class);
        job.setOutputKeyClass(IntWritable.class);
        job.setOutputValueClass(ArrayList.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

It gives the following error:

Exception in thread "main" java.lang.ClassNotFoundException: hadoopshingles.Saishingles
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:278)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:214)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

Please help me, and thank you in advance :)


Source: https://stackoverflow.com/questions/38478737
Updated: 2022-01-12 07:01

Accepted answer


Use uids (unique ids) as your pk; this will guarantee unique entries wherever they are created, so combining, updating, etc. can be done.

java.util.UUID.randomUUID();
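For example, a minimal sketch of generating such a key with that call (the class and variable names here are made up for illustration):

import java.util.UUID;

public class UuidKeyExample {
    public static void main(String[] args) {
        // randomUUID() produces a type-4 (randomly generated) UUID, so values
        // generated independently on different machines will not collide in practice.
        String pk = UUID.randomUUID().toString();
        System.out.println(pk);
    }
}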
