PyTorch Developer Podcast

TensorIterator

Episode Summary

You walk into the whiteboard room to do a technical interview. The interviewer looks you straight in the eye and says, "OK, can you show me how to add the elements of two lists together?" Confused, you write down a simple for loop that iterates through each element and adds them together. Your interviewer rubs his hands together evilly and cackles, "OK, let's make it more complicated." What does TensorIterator do? Why the heck is TensorIterator so complicated? What's going on with broadcasting? Type promotion? Overlap checks? Layout? Dimension coalescing? Parallelization? Vectorization?

Episode Notes

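The naive answer from the whiteboard story can be sketched in a few lines of Python. This is a sketch of the interviewee's starting point, not PyTorch code; every assumption baked into it is something TensorIterator has to relax:

```python
def add_lists(a, b):
    # Naive elementwise add. Assumes equal lengths, matching element
    # types, contiguous storage, and no parallelism -- exactly the
    # assumptions TensorIterator exists to generalize away.
    if len(a) != len(b):
        raise ValueError("lengths must match")
    return [x + y for x, y in zip(a, b)]

print(add_lists([1, 2, 3], [4, 5, 6]))  # [5, 7, 9]
```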
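Two of the complications above can be sketched in plain Python. Broadcasting follows NumPy-style rules (right-align the shapes; a size-1 dimension stretches to match), and dimension coalescing merges adjacent dimensions whose strides make them equivalent to one flat loop. These are illustrative sketches, not TensorIterator's actual implementation — in particular, real coalescing must hold simultaneously for every operand, not just one tensor as shown here:

```python
from itertools import zip_longest

def broadcast_shapes(a, b):
    # Right-align the two shapes and take the max along each dimension;
    # a size-1 dimension stretches, anything else must match exactly.
    result = []
    for x, y in zip_longest(reversed(a), reversed(b), fillvalue=1):
        if x != 1 and y != 1 and x != y:
            raise ValueError(f"cannot broadcast {a} with {b}")
        result.append(max(x, y))
    return tuple(reversed(result))

def coalesce(shape, strides):
    # Merge adjacent dims when iterating them separately is equivalent
    # to one flat loop: stride[i] == stride[i+1] * shape[i+1]
    # (row-major). Fewer dims means less loop overhead per element.
    shape, strides = list(shape), list(strides)
    i = 0
    while i + 1 < len(shape):
        if strides[i] == strides[i + 1] * shape[i + 1]:
            shape[i + 1] *= shape[i]
            del shape[i], strides[i]
        else:
            i += 1
    return tuple(shape), tuple(strides)

print(broadcast_shapes((3, 1), (1, 4)))      # (3, 4)
print(coalesce((2, 3, 4), (12, 4, 1)))       # ((24,), (1,)) -- fully contiguous
```

A contiguous (2, 3, 4) tensor collapses all the way to a single dimension of 24 elements, which is why an elementwise op on a contiguous tensor can run as one tight (and vectorizable) inner loop regardless of its logical rank.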

Further reading.