Overview
The History Server REST APIs allow users to get status information about applications that have finished running.
History Server Information API
This API returns overall information about the history server.
URI
Both of the following URIs return the same history server information.
* http://<history server http address:port>/ws/v1/history
* http://<history server http address:port>/ws/v1/history/info
HTTP Operations Supported
* GET
Query Parameters Supported
* None
Elements of the historyInfo object
| Name | Description |
|---|---|
| startedOn | The time the history server was started (in ms since epoch) |
| hadoopVersion | Version of hadoop common |
| hadoopBuildVersion | Hadoop common build string including build version, user, and checksum |
| hadoopVersionBuiltOn | Timestamp when hadoop common was built |
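As an illustration (not part of the original documentation), the sketch below queries this endpoint with Python's standard library and converts startedOn into a readable timestamp; the host and port are placeholders for your own history server address.

```python
import json
import urllib.request
from datetime import datetime, timezone

# Placeholder address; replace with your history server host:port.
BASE = "http://historyserver.example.com:19888/ws/v1/history"

with urllib.request.urlopen(BASE + "/info") as resp:
    info = json.load(resp)["historyInfo"]

# startedOn is in milliseconds since the epoch.
started = datetime.fromtimestamp(info["startedOn"] / 1000, tz=timezone.utc)
print("History server started on:", started.isoformat())
print("Hadoop version:", info["hadoopVersion"])
```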
Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/info
Response Header:
HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "historyInfo" : {
      "startedOn":1353512830963,
      "hadoopVersionBuiltOn" : "Wed Jan 11 21:18:36 UTC 2012",
      "hadoopBuildVersion" : "0.23.1-SNAPSHOT from 1230253 by user1 source checksum bb6e554c6d50b0397d826081017437a7",
      "hadoopVersion" : "0.23.1-SNAPSHOT"
   }
}
MapReduce APIs
The following resources apply to MapReduce:
Jobs API
The jobs resource provides a list of the MapReduce jobs that have finished. It does not currently return a full list of parameters.
- URI
  * http://<history server http address:port>/ws/v1/history/mapreduce/jobs
- HTTP Operations Supported
  * GET
- Query Parameters Supported
Multiple parameters can be specified. The started and finished times have a begin and end parameter to allow you to specify ranges; for example, you can request all jobs that started between 1:00am and 2:00pm on 12/19/2011 with startedTimeBegin=1324256400&startedTimeEnd=1324303200. If the Begin parameter is not specified it defaults to 0, and if the End parameter is not specified it defaults to infinity. (A request sketch using these parameters follows the example response below.)
  * user - user name
  * state - the job state
  * queue - queue name
  * limit - total number of app objects to be returned
  * startedTimeBegin - jobs with start time beginning with this time, specified in ms since epoch
  * startedTimeEnd - jobs with start time ending with this time, specified in ms since epoch
  * finishedTimeBegin - jobs with finish time beginning with this time, specified in ms since epoch
  * finishedTimeEnd - jobs with finish time ending with this time, specified in ms since epoch
- Elements of the jobs object
When you make a request for the list of jobs, the information is returned as an array of job objects. See the Job API for the syntax of the job object, except that this is a subset of a complete job: only the startTime, finishTime, id, name, queue, user, state, mapsTotal, mapsCompleted, reducesTotal, and reducesCompleted fields are returned.
| Name | Description |
|---|---|
| job | An array of job objects |
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "jobs" : {
      "job" : [
         {
            "submitTime" : 1326381344449, --提交时间
            "state" : "SUCCEEDED", --运行状态
            "user" : "user1", --用户
            "reducesTotal" : 1, --reduce的个数
            "mapsCompleted" : 1, --map的完成数
            "startTime" : 1326381344489, --开始时间
            "id" : "job_1326381300833_1_1", --job的id
            "name" : "word count", --job的名字
            "reducesCompleted" : 1, --reduce的完成数
            "mapsTotal" : 1, --map的个数
            "queue" : "default", --默认队列
            "finishTime" : 1326381356010 --结束时间
         },
         {
            "submitTime" : 1326381446500
            "state" : "SUCCEEDED",
            "user" : "user1",
            "reducesTotal" : 1,
            "mapsCompleted" : 1,
            "startTime" : 1326381446529,
            "id" : "job_1326381300833_2_2",
            "name" : "Sleep job",
            "reducesCompleted" : 1,
            "mapsTotal" : 1,
            "queue" : "default",
            "finishTime" : 1326381582106
         }
      ]
   }
}
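As a hedged sketch (not from the original documentation), the following combines the query parameters described above to list only the jobs of a given user that started within a time window; the base URL, user name, and time values are placeholders.

```python
import json
import urllib.parse
import urllib.request

BASE = "http://historyserver.example.com:19888/ws/v1/history"  # placeholder

# Filter: jobs submitted by "user1" that started in a given window.
params = {
    "user": "user1",
    "startedTimeBegin": 1324256400,
    "startedTimeEnd": 1324303200,
    "limit": 50,
}
url = BASE + "/mapreduce/jobs?" + urllib.parse.urlencode(params)

with urllib.request.urlopen(url) as resp:
    body = json.load(resp)

# "jobs" may be null when nothing matches; handle that defensively.
jobs = (body.get("jobs") or {}).get("job") or []
if isinstance(jobs, dict):  # some server versions return a single object instead of a list
    jobs = [jobs]
for job in jobs:
    print(job["id"], job["state"], job["name"])
```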
Job API
A Job resource contains information about a particular job identified by jobid.
- URI
* http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
- Elements of the job object
| Name | Description |
|---|---|
| id | The job id |
| name | The job name |
| queue | The queue the job was submitted to |
| user | The user name |
| state | The job state - valid values are: NEW, INITED, RUNNING, SUCCEEDED, FAILED, KILL_WAIT, KILLED, ERROR |
| diagnostics | A diagnostic message |
| submitTime | The time the job was submitted (in ms since epoch) |
| startTime | The time the job started (in ms since epoch) |
| finishTime | The time the job finished (in ms since epoch) |
| mapsTotal | The total number of maps |
| mapsCompleted | The number of completed maps |
| reducesTotal | The total number of reduces |
| reducesCompleted | The number of completed reduces |
| uberized | Indicates if the job was an uber job - ran completely in the application master |
| avgMapTime | The average time of a map task (in ms) |
| avgReduceTime | The average time of a reduce task (in ms) |
| avgShuffleTime | The average time of a shuffle (in ms) |
| avgMergeTime | The average time of a merge (in ms) |
| failedReduceAttempts | The number of failed reduce attempts |
| killedReduceAttempts | The number of killed reduce attempts |
| successfulReduceAttempts | The number of successful reduce attempts |
| failedMapAttempts | The number of failed map attempts |
| killedMapAttempts | The number of killed map attempts |
| successfulMapAttempts | The number of successful map attempts |
| acls | A collection of acls objects |
Elements of the acls object
| Name | Description |
|---|---|
| value | The acl value |
| name | The acl name |
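For illustration only (not part of the original documentation), a minimal sketch that fetches one job by id and reports its completion counts; the base URL and job id are placeholders.

```python
import json
import urllib.request

BASE = "http://historyserver.example.com:19888/ws/v1/history"  # placeholder
JOB_ID = "job_1326381300833_2_2"                               # placeholder

with urllib.request.urlopen(f"{BASE}/mapreduce/jobs/{JOB_ID}") as resp:
    job = json.load(resp)["job"]

print(f"{job['id']} ({job['name']}): {job['state']}")
print(f"maps {job['mapsCompleted']}/{job['mapsTotal']}, "
      f"reduces {job['reducesCompleted']}/{job['reducesTotal']}")
print("uberized:", job["uberized"])
```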
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Server: Jetty(6.1.26)
  Content-Length: 720
Response Body:
{
   "job" : {
      "submitTime":  1326381446500,
      "avgReduceTime" : 124961,
      "failedReduceAttempts" : 0,
      "state" : "SUCCEEDED",
      "successfulReduceAttempts" : 1,
      "acls" : [
         {
            "value" : " ",
            "name" : "mapreduce.job.acl-modify-job"
         },
         {
            "value" : " ",
            "name" : "mapreduce.job.acl-view-job"
         }
      ],
      "user" : "user1",
      "reducesTotal" : 1,
      "mapsCompleted" : 1,
      "startTime" : 1326381446529,
      "id" : "job_1326381300833_2_2",
      "avgMapTime" : 2638,
      "successfulMapAttempts" : 1,
      "name" : "Sleep job",
      "avgShuffleTime" : 2540,
      "reducesCompleted" : 1,
      "diagnostics" : "",
      "failedMapAttempts" : 0,
      "avgMergeTime" : 2589,
      "killedReduceAttempts" : 0,
      "mapsTotal" : 1,
      "queue" : "default",
      "uberized" : false,
      "killedMapAttempts" : 0,
      "finishTime" : 1326381582106
   }
}
Job Attempts API
With this API you can obtain a collection of resources that represent a job attempt. When you run a GET operation on this resource, you obtain a collection of job attempt objects.
- URI
* http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/jobattempts
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
The jobAttempts object returned by the URI above is a collection of jobAttempt objects.
- Elements of the jobAttempt object
| Name | Description |
|---|---|
| id | The job attempt id |
| nodeId | The id of the node the attempt ran on |
| nodeHttpAddress | The http address of the node the attempt ran on |
| logsLink | The http link to the job attempt logs |
| containerId | The id of the container for the job attempt |
| startTime | The start time of the attempt (in ms since epoch) |
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/jobattempts
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "jobAttempts" : {
      "jobAttempt" : [
         {
            "nodeId" : "host.domain.com:8041",
            "nodeHttpAddress" : "host.domain.com:8042",
            "startTime" : 1326381444693,
            "id" : 1,
            "logsLink" : "http://host.domain.com:19888/jobhistory/logs/host.domain.com:8041/container_1326381300833_0002_01_000001/job_1326381300833_2_2/user1",
            "containerId" : "container_1326381300833_0002_01_000001"
         }
      ]
   }
}
Job Counters API
With this API you can obtain a collection of resources that represent all the counters for the job.
- URI
  * http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/counters
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
- Elements of the jobCounters object
| Name | Description |
|---|---|
| id | The job id |
| counterGroup | A collection of counter group objects |
- Elements of the counterGroup object
| Name | Description |
|---|---|
| counterGroupName | The name of the counter group |
| counter | A collection of counter objects |
- Elements of the counter object
| Name | Description |
|---|---|
| name | The name of the counter |
| reduceCounterValue | The counter value of reduce tasks |
| mapCounterValue | The counter value of map tasks |
| totalCounterValue | The counter value of all tasks |
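The nesting described above (jobCounters → counterGroup → counter) can be walked as in the sketch below; this is an illustrative example rather than part of the original documentation, and it assumes the JSON shape shown in the example response that follows. The base URL and job id are placeholders.

```python
import json
import urllib.request

BASE = "http://historyserver.example.com:19888/ws/v1/history"  # placeholder
JOB_ID = "job_1326381300833_2_2"                               # placeholder

with urllib.request.urlopen(f"{BASE}/mapreduce/jobs/{JOB_ID}/counters") as resp:
    job_counters = json.load(resp)["jobCounters"]

def find_counter(groups, group_name, counter_name):
    """Return the totalCounterValue of one counter, or None if absent."""
    for group in groups:
        if group["counterGroupName"] == group_name:
            for counter in group["counter"]:
                if counter["name"] == counter_name:
                    return counter["totalCounterValue"]
    return None

value = find_counter(job_counters["counterGroup"],
                     "org.apache.hadoop.mapreduce.FileSystemCounter",
                     "HDFS_BYTES_READ")
print("HDFS_BYTES_READ =", value)
```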
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/counters
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "jobCounters" : {
      "id" : "job_1326381300833_2_2",
      "counterGroup" : [
         {
            "counterGroupName" : "Shuffle Errors",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "BAD_ID"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "CONNECTION"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "IO_ERROR"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "WRONG_LENGTH"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "WRONG_MAP"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "WRONG_REDUCE"
               }
            ]
          },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.FileSystemCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2483,
                  "name" : "FILE_BYTES_READ"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 108525,
                  "name" : "FILE_BYTES_WRITTEN"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FILE_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FILE_LARGE_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FILE_WRITE_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 48,
                  "name" : "HDFS_BYTES_READ"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "HDFS_BYTES_WRITTEN"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "HDFS_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "HDFS_LARGE_READ_OPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "HDFS_WRITE_OPS"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.TaskCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "MAP_INPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1200,
                  "name" : "MAP_OUTPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 4800,
                  "name" : "MAP_OUTPUT_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2235,
                  "name" : "MAP_OUTPUT_MATERIALIZED_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 48,
                  "name" : "SPLIT_RAW_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "COMBINE_INPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "COMBINE_OUTPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1200,
                  "name" : "REDUCE_INPUT_GROUPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2235,
                  "name" : "REDUCE_SHUFFLE_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1200,
                  "name" : "REDUCE_INPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "REDUCE_OUTPUT_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2400,
                  "name" : "SPILLED_RECORDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "SHUFFLED_MAPS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "FAILED_SHUFFLE"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1,
                  "name" : "MERGED_MAP_OUTPUTS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 113,
                  "name" : "GC_TIME_MILLIS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 1830,
                  "name" : "CPU_MILLISECONDS"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 478068736,
                  "name" : "PHYSICAL_MEMORY_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 2159284224,
                  "name" : "VIRTUAL_MEMORY_BYTES"
               },
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 378863616,
                  "name" : "COMMITTED_HEAP_BYTES"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.input.FileInputFormatCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "BYTES_READ"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter",
            "counter" : [
               {
                  "reduceCounterValue" : 0,
                  "mapCounterValue" : 0,
                  "totalCounterValue" : 0,
                  "name" : "BYTES_WRITTEN"
               }
            ]
         }
      ]
   }
}
Job Conf API
A job configuration resource contains information about the job configuration for this job.
- URI
* http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/conf
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
- Elements of the conf object
| Name | Description |
|---|---|
| path | The path to the job configuration file |
| property | A collection of configuration property objects |
- Elements of the property object
| Name | Description |
|---|---|
| name | The name of the configuration property |
| value | The value of the configuration property |
| source | The location of this configuration object. If there is more than one, it shows the history, with the latest source at the end of the list. |
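As an illustrative sketch (not part of the original documentation), the following looks up a single configuration property by name from the flat property list; the base URL, job id, and property name are placeholders, and the property list shape matches the example response below.

```python
import json
import urllib.request

BASE = "http://historyserver.example.com:19888/ws/v1/history"  # placeholder
JOB_ID = "job_1326381300833_2_2"                               # placeholder

with urllib.request.urlopen(f"{BASE}/mapreduce/jobs/{JOB_ID}/conf") as resp:
    conf = json.load(resp)["conf"]

# Build a name -> value map from the flat property list.
props = {p["name"]: p["value"] for p in conf["property"]}
print("config file path:", conf["path"])
print("mapreduce.cluster.temp.dir =", props.get("mapreduce.cluster.temp.dir"))
```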
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/conf
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
This is a small snippet of the output, as the full output is very large. The actual output contains every property in your job configuration file.
{
   "conf" : {
      "path" : "hdfs://host.domain.com:9000/user/user1/.staging/job_1326381300833_0002/job.xml",
      "property" : [
         {
            "value" : "/home/hadoop/hdfs/data",
            "name" : "dfs.datanode.data.dir"
            "source" : ["hdfs-site.xml", "job.xml"]
         },
         {
            "value" : "org.apache.hadoop.yarn.server.webproxy.amfilter.AmFilterInitializer",
            "name" : "hadoop.http.filter.initializers"
            "source" : ["programmatically", "job.xml"]
         },
         {
            "value" : "/home/hadoop/tmp",
            "name" : "mapreduce.cluster.temp.dir"
            "source" : ["mapred-site.xml"]
         },
         ...
      ]
   }
}
Tasks API
With the tasks API you can obtain a collection of resources that represent the tasks within a job. When you run a GET operation on this resource, you obtain a collection of task objects.
- URI
  * http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/tasks
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * type - type of task, valid values are m or r. m for map task or r for reduce task (see the sketch below).
Elements of the tasks object: the tasks object contains a collection of task objects.
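For example, the type parameter can be used to fetch only the map tasks of a job. The sketch below is illustrative and not part of the original documentation; host and job id are placeholders.

```python
import json
import urllib.request

BASE = "http://historyserver.example.com:19888/ws/v1/history"  # placeholder
JOB_ID = "job_1326381300833_2_2"                               # placeholder

# type=m restricts the listing to map tasks (use type=r for reduce tasks).
url = f"{BASE}/mapreduce/jobs/{JOB_ID}/tasks?type=m"
with urllib.request.urlopen(url) as resp:
    tasks = json.load(resp)["tasks"]["task"]

if isinstance(tasks, dict):  # some server versions return a single object instead of a list
    tasks = [tasks]
for task in tasks:
    print(task["id"], task["state"], f"{task['elapsedTime']} ms")
```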
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/tasks
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "tasks" : {
      "task" : [
         {
            "progress" : 100,
            "elapsedTime" : 6777,
            "state" : "SUCCEEDED",
            "startTime" : 1326381446541,
            "id" : "task_1326381300833_2_2_m_0",
            "type" : "MAP",
            "successfulAttempt" : "attempt_1326381300833_2_2_m_0_0",
            "finishTime" : 1326381453318
         },
         {
            "progress" : 100,
            "elapsedTime" : 135559,
            "state" : "SUCCEEDED",
            "startTime" : 1326381446544,
            "id" : "task_1326381300833_2_2_r_0",
            "type" : "REDUCE",
            "successfulAttempt" : "attempt_1326381300833_2_2_r_0_0",
            "finishTime" : 1326381582103
         }
      ]
   }
}
Task API
A Task resource contains information about a particular task within a job.
- URI
  * http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/tasks/{taskid}
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
- Elements of the task object
| Name | Description |
|---|---|
| id | The task id |
| state | The state of the task - valid values are: NEW, SCHEDULED, RUNNING, SUCCEEDED, FAILED, KILL_WAIT, KILLED |
| type | The task type - MAP or REDUCE |
| successfulAttempt | The id of the last successful attempt |
| progress | The progress of the task as a percent |
| startTime | The time the task started (in ms since epoch), or -1 if it was never started |
| finishTime | The time the task finished (in ms since epoch) |
| elapsedTime | The elapsed time since the application started (in ms) |
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/tasks/task_1326381300833_2_2_m_0
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "task" : {
      "progress" : 100,
      "elapsedTime" : 6777,
      "state" : "SUCCEEDED",
      "startTime" : 1326381446541,
      "id" : "task_1326381300833_2_2_m_0",
      "type" : "MAP",
      "successfulAttempt" : "attempt_1326381300833_2_2_m_0_0",
      "finishTime" : 1326381453318
   }
}
Task Counters API
With the task counters API you can obtain a collection of resources that represent all the counters for that task.
- URI
* http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/tasks/{taskid}/counters
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
The jobTaskCounters object contains id (the task id) and taskCounterGroup (a collection of counter group objects).
Each counter group object contains counterGroupName (the name of the counter group) and counter (a collection of counter objects).
Each counter object contains name and value.
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/tasks/task_1326381300833_2_2_m_0/counters
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "jobTaskCounters" : {
      "id" : "task_1326381300833_2_2_m_0",
      "taskCounterGroup" : [
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.FileSystemCounter",
            "counter" : [
               {
                  "value" : 2363,
                  "name" : "FILE_BYTES_READ"
               },
               {
                  "value" : 54372,
                  "name" : "FILE_BYTES_WRITTEN"
               },
               {
                  "value" : 0,
                  "name" : "FILE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_WRITE_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_READ"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_WRITTEN"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_WRITE_OPS"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.TaskCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "COMBINE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "COMBINE_OUTPUT_RECORDS"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_GROUPS"
               },
               {
                  "value" : 2235,
                  "name" : "REDUCE_SHUFFLE_BYTES"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "REDUCE_OUTPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "SPILLED_RECORDS"
               },
               {
                  "value" : 1,
                  "name" : "SHUFFLED_MAPS"
               },
               {
                  "value" : 0,
                  "name" : "FAILED_SHUFFLE"
               },
               {
                  "value" : 1,
                  "name" : "MERGED_MAP_OUTPUTS"
               },
               {
                  "value" : 26,
                  "name" : "GC_TIME_MILLIS"
               },
               {
                  "value" : 860,
                  "name" : "CPU_MILLISECONDS"
               },
               {
                  "value" : 107839488,
                  "name" : "PHYSICAL_MEMORY_BYTES"
               },
               {
                  "value" : 1123147776,
                  "name" : "VIRTUAL_MEMORY_BYTES"
               },
               {
                  "value" : 57475072,
                  "name" : "COMMITTED_HEAP_BYTES"
               }
            ]
         },
         {
            "counterGroupName" : "Shuffle Errors",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BAD_ID"
               },
               {
                  "value" : 0,
                  "name" : "CONNECTION"
               },
               {
                  "value" : 0,
                  "name" : "IO_ERROR"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_LENGTH"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_MAP"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_REDUCE"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BYTES_WRITTEN"
               }
            ]
         }
      ]
   }
}
Task Attempts API
With the task attempts API, you can obtain a collection of resources that represent the attempts of a task within a job. When you run a GET operation on this resource, you obtain a collection of task attempt objects.
- URI
  * http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/tasks/{taskid}/attempts
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
The taskAttempts object contains taskAttempt, a collection of task attempt objects.
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/tasks/task_1326381300833_2_2_m_0/attempts
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "taskAttempts" : {
      "taskAttempt" : [
         {
            "assignedContainerId" : "container_1326381300833_0002_01_000002",
            "progress" : 100,
            "elapsedTime" : 2638,
            "state" : "SUCCEEDED",
            "diagnostics" : "",
            "rack" : "/98.139.92.0",
            "nodeHttpAddress" : "host.domain.com:8042",
            "startTime" : 1326381450680,
            "id" : "attempt_1326381300833_2_2_m_0_0",
            "type" : "MAP",
            "finishTime" : 1326381453318
         }
      ]
   }
}
Task Attempt API
A Task Attempt resource contains information about a particular task attempt within a job.
- URI
* http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/tasks/{taskid}/attempts/{attemptid}
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
- Elements of the taskAttempt object
| Name | Description |
|---|---|
| id | The task attempt id |
| rack | The rack |
| state | The state of the task attempt - valid values are: NEW, UNASSIGNED, ASSIGNED, RUNNING, COMMIT_PENDING, SUCCESS_CONTAINER_CLEANUP, SUCCEEDED, FAIL_CONTAINER_CLEANUP, FAIL_TASK_CLEANUP, FAILED, KILL_CONTAINER_CLEANUP, KILL_TASK_CLEANUP, KILLED |
| type | The type of task |
| assignedContainerId | The container id this attempt is assigned to |
| nodeHttpAddress | The http address of the node this task attempt ran on |
| diagnostics | A diagnostic message |
| progress | The progress of the task attempt as a percent |
| startTime | The time the task attempt started (in ms since epoch) |
| finishTime | The time the task attempt finished (in ms since epoch) |
| elapsedTime | The elapsed time since the task attempt started (in ms) |
For reduce task attempts, the following fields are also available:
| Name | Description |
|---|---|
| shuffleFinishTime | The time the shuffle finished (in ms since epoch) |
| mergeFinishTime | The time the merge finished (in ms since epoch) |
| elapsedShuffleTime | The time taken for the shuffle phase to complete (time in ms between the reduce task start and the shuffle finish) |
| elapsedMergeTime | The time taken for the merge phase to complete (time in ms between the end of the shuffle and the finish of the merge) |
| elapsedReduceTime | The time taken for the reduce phase to complete (time in ms between the merge finish and the end of the reduce task) |
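The three elapsed fields are simple differences of the timestamps above. The small sketch below is illustrative only; it recomputes them from a reduce task attempt object already parsed from JSON, using the field names from this table.

```python
def reduce_phase_times(attempt):
    """Recompute the elapsed fields for a reduce task attempt from its timestamps.

    attempt is a dict shaped like the taskAttempt object described above.
    """
    shuffle = attempt["shuffleFinishTime"] - attempt["startTime"]
    merge = attempt["mergeFinishTime"] - attempt["shuffleFinishTime"]
    reduce_ = attempt["finishTime"] - attempt["mergeFinishTime"]
    return {"elapsedShuffleTime": shuffle,
            "elapsedMergeTime": merge,
            "elapsedReduceTime": reduce_}
```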
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/tasks/task_1326381300833_2_2_m_0/attempts/attempt_1326381300833_2_2_m_0_0
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "taskAttempt" : {
      "assignedContainerId" : "container_1326381300833_0002_01_000002",
      "progress" : 100,
      "elapsedTime" : 2638,
      "state" : "SUCCEEDED",
      "diagnostics" : "",
      "rack" : "/98.139.92.0",
      "nodeHttpAddress" : "host.domain.com:8042",
      "startTime" : 1326381450680,
      "id" : "attempt_1326381300833_2_2_m_0_0",
      "type" : "MAP",
      "finishTime" : 1326381453318
   }
}
Task Attempt Counters API
With the task attempt counters API, you can obtain a collection of resources that represent all the counters for that task attempt.
- URI
  * http://<history server http address:port>/ws/v1/history/mapreduce/jobs/{jobid}/tasks/{taskid}/attempts/{attemptid}/counters
- HTTP Operations Supported
  * GET
- Query Parameters Supported
  * None
The jobTaskAttemptCounters object contains id (the task attempt id) and taskAttemptCounterGroup (a collection of counter group objects).
Each counter group object contains counterGroupName (the name of the counter group) and counter (a collection of counter objects).
Each counter object contains name and value.
- Response Example
JSON response
HTTP Request:
GET http://<history server http address:port>/ws/v1/history/mapreduce/jobs/job_1326381300833_2_2/tasks/task_1326381300833_2_2_m_0/attempts/attempt_1326381300833_2_2_m_0_0/counters
Response Header:
  HTTP/1.1 200 OK
  Content-Type: application/json
  Transfer-Encoding: chunked
  Server: Jetty(6.1.26)
Response Body:
{
   "jobTaskAttemptCounters" : {
      "taskAttemptCounterGroup" : [
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.FileSystemCounter",
            "counter" : [
               {
                  "value" : 2363,
                  "name" : "FILE_BYTES_READ"
               },
               {
                  "value" : 54372,
                  "name" : "FILE_BYTES_WRITTEN"
               },
               {
                  "value" : 0,
                  "name" : "FILE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "FILE_WRITE_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_READ"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_BYTES_WRITTEN"
               },
                {
                  "value" : 0,
                  "name" : "HDFS_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_LARGE_READ_OPS"
               },
               {
                  "value" : 0,
                  "name" : "HDFS_WRITE_OPS"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.TaskCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "COMBINE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "COMBINE_OUTPUT_RECORDS"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_GROUPS"
               },
               {
                  "value" : 2235,
                  "name" : "REDUCE_SHUFFLE_BYTES"
               },
               {
                  "value" : 460,
                  "name" : "REDUCE_INPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "REDUCE_OUTPUT_RECORDS"
               },
               {
                  "value" : 0,
                  "name" : "SPILLED_RECORDS"
               },
               {
                  "value" : 1,
                  "name" : "SHUFFLED_MAPS"
               },
               {
                  "value" : 0,
                  "name" : "FAILED_SHUFFLE"
               },
               {
                  "value" : 1,
                  "name" : "MERGED_MAP_OUTPUTS"
               },
               {
                  "value" : 26,
                  "name" : "GC_TIME_MILLIS"
               },
               {
                  "value" : 860,
                  "name" : "CPU_MILLISECONDS"
               },
               {
                  "value" : 107839488,
                  "name" : "PHYSICAL_MEMORY_BYTES"
               },
               {
                  "value" : 1123147776,
                  "name" : "VIRTUAL_MEMORY_BYTES"
               },
               {
                  "value" : 57475072,
                  "name" : "COMMITTED_HEAP_BYTES"
               }
            ]
         },
         {
            "counterGroupName" : "Shuffle Errors",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BAD_ID"
               },
               {
                  "value" : 0,
                  "name" : "CONNECTION"
               },
               {
                  "value" : 0,
                  "name" : "IO_ERROR"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_LENGTH"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_MAP"
               },
               {
                  "value" : 0,
                  "name" : "WRONG_REDUCE"
               }
            ]
         },
         {
            "counterGroupName" : "org.apache.hadoop.mapreduce.lib.output.FileOutputFormatCounter",
            "counter" : [
               {
                  "value" : 0,
                  "name" : "BYTES_WRITTEN"
               }
            ]
         }
      ],
      "id" : "attempt_1326381300833_2_2_m_0_0"
   }
}