First, Context is an inner class of Mapper. Where can we see that? Right here:

public class Mapper<KEYIN, VALUEIN, KEYOUT, VALUEOUT> {

  /**
   * The <code>Context</code> passed on to the {@link Mapper} implementations.
   */
  public abstract class Context
    implements MapContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT> {
  }
}

What is the benefit of making it an inner class? The Mapper class's type parameters (KEYIN, VALUEIN, KEYOUT, VALUEOUT) are directly visible to Context, so it can refer to them without re-declaring anything, as the sketch below shows.
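As a quick illustration of that convenience, here is a minimal WordCount-style mapper sketch (the class and field names are hypothetical, for illustration only). Because Context is an inner class, the single context parameter of map() already knows that the output key type is Text and the output value type is IntWritable, so context.write() is fully typed with no casts.

import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical example: Context here is Mapper<LongWritable, Text, Text, IntWritable>.Context,
// so context.write() expects exactly (Text, IntWritable).
public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

  private static final IntWritable ONE = new IntWritable(1);
  private final Text word = new Text();

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    for (String token : value.toString().split("\\s+")) {
      word.set(token);
      context.write(word, ONE);   // KEYOUT/VALUEOUT come from the enclosing Mapper declaration
    }
  }
}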
The Context class implements the MapContext interface. Take a look: it declares a method for getting the input split. As for where the context instance actually comes from, we still need to trace further up the hierarchy.

public interface MapContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT> 
  extends TaskInputOutputContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT> {

  /**
   * Get the input split for this map.
   */
  public InputSplit getInputSplit();
  
}
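One common use of getInputSplit(): when the job uses a FileInputFormat-based input format, the split is typically a FileSplit, so a mapper can find out which file it is reading. A minimal sketch, assuming the cast to FileSplit holds (the class name is hypothetical):

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileSplit;

public class SplitAwareMapper extends Mapper<LongWritable, Text, Text, Text> {

  // Name of the input file this map task is processing, for use later in map().
  private String fileName;

  @Override
  protected void setup(Context context) {
    // getInputSplit() comes from MapContext; the concrete type depends on the InputFormat.
    FileSplit split = (FileSplit) context.getInputSplit();
    fileName = split.getPath().getName();
  }
}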

TaskInputOutputContext declares how to check whether there is a next key/value pair and how to read the current pair. The write method is also defined here: it is what emits (k2, v2). But where does the output actually get written to?

public interface TaskInputOutputContext<KEYIN,VALUEIN,KEYOUT,VALUEOUT> 
       extends TaskAttemptContext {

  /**
   * Advance to the next key, value pair, returning false if at end.
   * @return true if a key/value pair was read, false if there are no more
   */
  public boolean nextKeyValue() throws IOException, InterruptedException;
 
  /**
   * Get the current key.
   * @return the current key object or null if there isn't one
   * @throws IOException
   * @throws InterruptedException
   */
  public KEYIN getCurrentKey() throws IOException, InterruptedException;

  /**
   * Get the current value.
   * @return the value object that was read into
   * @throws IOException
   * @throws InterruptedException
   */
  public VALUEIN getCurrentValue() throws IOException, InterruptedException;

  /**
   * Generate an output key/value pair.
   */
  public void write(KEYOUT key, VALUEOUT value) 
      throws IOException, InterruptedException;

  /**
   * Get the {@link OutputCommitter} for the task-attempt.
   * @return the <code>OutputCommitter</code> for the task-attempt
   */
  public OutputCommitter getOutputCommitter();
}
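These three methods are exactly what drives a map task: Mapper.run() repeatedly calls nextKeyValue() and feeds the current key and value into map(). Simplified, the loop in the Hadoop source looks roughly like this:

// Simplified shape of org.apache.hadoop.mapreduce.Mapper#run
public void run(Context context) throws IOException, InterruptedException {
  setup(context);
  try {
    while (context.nextKeyValue()) {
      map(context.getCurrentKey(), context.getCurrentValue(), context);
    }
  } finally {
    cleanup(context);
  }
}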

Tracing further up: TaskInputOutputContext in turn extends the TaskAttemptContext interface, which carries information about the task attempt (and, via JobContext, the job itself).

public interface TaskAttemptContext extends JobContext, Progressable {

  /**
   * Get the unique name for this task attempt.
   */
  public TaskAttemptID getTaskAttemptID();

  /**
   * Set the current status of the task to the given string.
   */
  public void setStatus(String msg);

  /**
   * Get the last set status message.
   * @return the current status message
   */
  public String getStatus();
  
  /**
   * The current progress of the task attempt.
   * @return a number between 0.0 and 1.0 (inclusive) indicating the attempt's
   * progress.
   */
  public abstract float getProgress();

  /**
   * Get the {@link Counter} for the given <code>counterName</code>.
   * @param counterName counter name
   * @return the <code>Counter</code> for the given <code>counterName</code>
   */
  public Counter getCounter(Enum<?> counterName);

  /**
   * Get the {@link Counter} for the given <code>groupName</code> and 
   * <code>counterName</code>.
   * @param counterName counter name
   * @return the <code>Counter</code> for the given <code>groupName</code> and 
   *         <code>counterName</code>
   */
  public Counter getCounter(String groupName, String counterName);

}
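In user code, these task-attempt methods are typically used to report progress and statistics from inside map() or reduce(), for example custom counters and status messages. A small sketch (the counter enum and class name are made up for illustration):

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CountingMapper extends Mapper<LongWritable, Text, Text, NullWritable> {

  // Hypothetical counter group, purely illustrative.
  enum MyCounters { BAD_RECORDS }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    if (value.getLength() == 0) {
      // getCounter(Enum) and setStatus(String) are declared on TaskAttemptContext.
      context.getCounter(MyCounters.BAD_RECORDS).increment(1);
      context.setStatus("skipping empty record at offset " + key.get());
      return;
    }
    context.write(value, NullWritable.get());
  }
}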

Tracing further up: TaskAttemptContext extends JobContext, which exposes the job configuration and everything configured on the Job.

public interface JobContext extends MRJobConfig {
  /**
   * Return the configuration for the job.
   * @return the shared configuration object
   */
  public Configuration getConfiguration();

  /**
   * Get credentials for the job.
   * @return credentials for the job
   */
  public Credentials getCredentials();

  /**
   * Get the unique ID for the job.
   * @return the object with the job id
   */
  public JobID getJobID();
  
  /**
   * Get configured the number of reduce tasks for this job. Defaults to 
   * <code>1</code>.
   * @return the number of reduce tasks for this job.
   */
  public int getNumReduceTasks();
  
  /**
   * Get the current working directory for the default file system.
   * 
   * @return the directory name.
   */
  public Path getWorkingDirectory() throws IOException;

  /**
   * Get the key class for the job output data.
   * @return the key class for the job output data.
   */
  public Class<?> getOutputKeyClass();
  
  /**
   * Get the value class for job outputs.
   * @return the value class for job outputs.
   */
  public Class<?> getOutputValueClass();

  /**
   * Get the key class for the map output data. If it is not set, use the
   * (final) output key class. This allows the map output key class to be
   * different than the final output key class.
   * @return the map output key class.
   */
  public Class<?> getMapOutputKeyClass();

  /**
   * Get the value class for the map output data. If it is not set, use the
   * (final) output value class. This allows the map output value class to be
   * different than the final output value class.
   *  
   * @return the map output value class.
   */
  public Class<?> getMapOutputValueClass();

  /**
   * Get the user-specified job name. This is only used to identify the 
   * job to the user.
   * 
   * @return the job's name, defaulting to "".
   */
  public String getJobName();

  /**
   * Get the {@link InputFormat} class for the job.
   * 
   * @return the {@link InputFormat} class for the job.
   */
  public Class<? extends InputFormat<?,?>> getInputFormatClass() 
     throws ClassNotFoundException;

  /**
   * Get the {@link Mapper} class for the job.
   * 
   * @return the {@link Mapper} class for the job.
   */
  public Class<? extends Mapper<?,?,?,?>> getMapperClass() 
     throws ClassNotFoundException;

  /**
   * Get the combiner class for the job.
   * 
   * @return the combiner class for the job.
   */
  public Class<? extends Reducer<?,?,?,?>> getCombinerClass() 
     throws ClassNotFoundException;

  /**
   * Get the {@link Reducer} class for the job.
   * 
   * @return the {@link Reducer} class for the job.
   */
  public Class<? extends Reducer<?,?,?,?>> getReducerClass() 
     throws ClassNotFoundException;

  /**
   * Get the {@link OutputFormat} class for the job.
   * 
   * @return the {@link OutputFormat} class for the job.
   */
  public Class<? extends OutputFormat<?,?>> getOutputFormatClass() 
     throws ClassNotFoundException;

  /**
   * Get the {@link Partitioner} class for the job.
   * 
   * @return the {@link Partitioner} class for the job.
   */
  public Class<? extends Partitioner<?,?>> getPartitionerClass() 
     throws ClassNotFoundException;

  /**
   * Get the {@link RawComparator} comparator used to compare keys.
   * 
   * @return the {@link RawComparator} comparator used to compare keys.
   */
  public RawComparator<?> getSortComparator();

  /**
   * Get the pathname of the job's jar.
   * @return the pathname
   */
  public String getJar();

  /**
   * Get the user defined {@link RawComparator} comparator for
   * grouping keys of inputs to the combiner.
   *
   * @return comparator set by the user for grouping values.
   * @see Job#setCombinerKeyGroupingComparatorClass(Class)
   */
  public RawComparator<?> getCombinerKeyGroupingComparator();

    /**
     * Get the user defined {@link RawComparator} comparator for
     * grouping keys of inputs to the reduce.
     *
     * @return comparator set by the user for grouping values.
     * @see Job#setGroupingComparatorClass(Class)
     * @see #getCombinerKeyGroupingComparator()
     */
  public RawComparator<?> getGroupingComparator();
  
  /**
   * Get whether job-setup and job-cleanup is needed for the job 
   * 
   * @return boolean 
   */
  public boolean getJobSetupCleanupNeeded();
  
  /**
   * Get whether task-cleanup is needed for the job 
   * 
   * @return boolean 
   */
  public boolean getTaskCleanupNeeded();

  /**
   * Get whether the task profiling is enabled.
   * @return true if some tasks will be profiled
   */
  public boolean getProfileEnabled();

  /**
   * Get the profiler configuration arguments.
   *
   * The default value for this property is
   * "-agentlib:hprof=cpu=samples,heap=sites,force=n,thread=y,verbose=n,file=%s"
   * 
   * @return the parameters to pass to the task child to configure profiling
   */
  public String getProfileParams();

  /**
   * Get the range of maps or reduces to profile.
   * @param isMap is the task a map?
   * @return the task ranges
   */
  public IntegerRanges getProfileTaskRange(boolean isMap);

  /**
   * Get the reported username for this job.
   * 
   * @return the username
   */
  public String getUser();
  
  /**
   * Originally intended to check if symlinks should be used, but currently
   * symlinks cannot be disabled.
   * @return true
   */
  @Deprecated
  public boolean getSymlink();
  
  /**
   * Get the archive entries in classpath as an array of Path
   */
  public Path[] getArchiveClassPaths();

  /**
   * Get cache archives set in the Configuration
   * @return A URI array of the caches set in the Configuration
   * @throws IOException
   */
  public URI[] getCacheArchives() throws IOException;

  /**
   * Get cache files set in the Configuration
   * @return A URI array of the files set in the Configuration
   * @throws IOException
   */

  public URI[] getCacheFiles() throws IOException;

  /**
   * Return the path array of the localized caches
   * @return A path array of localized caches
   * @throws IOException
   * @deprecated the array returned only includes the items that were 
   * downloaded. There is no way to map this to what is returned by
   * {@link #getCacheArchives()}.
   */
  @Deprecated
  public Path[] getLocalCacheArchives() throws IOException;

  /**
   * Return the path array of the localized files
   * @return A path array of localized files
   * @throws IOException
   * @deprecated the array returned only includes the items that were 
   * downloaded. There is no way to map this to what is returned by
   * {@link #getCacheFiles()}.
   */
  @Deprecated
  public Path[] getLocalCacheFiles() throws IOException;

  /**
   * Get the file entries in classpath as an array of Path
   */
  public Path[] getFileClassPaths();
  
  /**
   * Get the timestamps of the archives.  Used by internal
   * DistributedCache and MapReduce code.
   * @return a string array of timestamps 
   */
  public String[] getArchiveTimestamps();

  /**
   * Get the timestamps of the files.  Used by internal
   * DistributedCache and MapReduce code.
   * @return a string array of timestamps 
   */
  public String[] getFileTimestamps();

  /** 
   * Get the configured number of maximum attempts that will be made to run a
   * map task, as specified by the <code>mapred.map.max.attempts</code>
   * property. If this property is not already set, the default is 4 attempts.
   *  
   * @return the max number of attempts per map task.
   */
  public int getMaxMapAttempts();

  /** 
   * Get the configured number of maximum attempts  that will be made to run a
   * reduce task, as specified by the <code>mapred.reduce.max.attempts</code>
   * property. If this property is not already set, the default is 4 attempts.
   * 
   * @return the max number of attempts per reduce task.
   */
  public int getMaxReduceAttempts();

}
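Because the mapper's Context ultimately extends JobContext, a map task can read job-level settings directly, for example the shared Configuration and the number of reduce tasks. A minimal sketch (the "my.custom.threshold" property name and class name are hypothetical):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class ConfigAwareMapper extends Mapper<LongWritable, Text, Text, Text> {

  private int threshold;

  @Override
  protected void setup(Context context) {
    // getConfiguration() and getNumReduceTasks() are inherited from JobContext.
    Configuration conf = context.getConfiguration();
    threshold = conf.getInt("my.custom.threshold", 10);   // hypothetical job property
    int reduces = context.getNumReduceTasks();
    System.out.println("running with " + reduces + " reduce task(s)");
  }
}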

To summarize:

  • Context is an abstract inner class of Mapper.
  • Through its interface chain (MapContext → TaskInputOutputContext → TaskAttemptContext → JobContext), Context exposes the input split, key/value iteration and output writing, task-attempt status and counters, and Job-related configuration.