pyspark.TaskContext.taskAttemptId
TaskContext.taskAttemptId() → int
An ID that is unique to this task attempt (within the same SparkContext, no two task attempts will share the same attempt ID). This is roughly equivalent to Hadoop's TaskAttemptID.

Returns
    int
        the current task attempt ID.