I started the task by matching the replacement screen to the phone using a Transform node, adjusting the rotate, scale, and skew parameters until it lined up with the device.
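Under the hood, a Transform node's rotate, skew, and scale knobs compose into a single 2×2 matrix applied to every pixel position. A minimal sketch of that composition in plain Python (the function names and the rotate·skew·scale ordering are my own illustration, not Nuke's internals):

```python
import math

def transform_matrix(rotate_deg=0.0, scale_x=1.0, scale_y=1.0, skew=0.0):
    """Compose rotate * skew * scale into one 2x2 matrix (illustrative ordering)."""
    c = math.cos(math.radians(rotate_deg))
    s = math.sin(math.radians(rotate_deg))
    return [
        [c * scale_x, (c * skew - s) * scale_y],
        [s * scale_x, (s * skew + c) * scale_y],
    ]

def apply_transform(m, x, y):
    """Apply the 2x2 matrix to a point (translation omitted for brevity)."""
    return (m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)
```

For example, a 90° rotation sends the point (1, 0) to (0, 1), and a pure scale of (2, 3) sends (1, 1) to (2, 3), which matches what you see when dragging those knobs interactively.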
Afterward, I attached a Tracker node to the original footage and placed four tracking points, one near each corner of the screen. To make the iPhone screen follow the camera movement, I selected all four points and exported a CornerPin2D (current frame, baked). Once the resulting CornerPin2D is connected to the Merge node, the screen sticks to the track.
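What the baked CornerPin2D does with those four points is solve a projective transform (homography) that maps the four source corners of the screen onto the four tracked destination corners, per frame. A rough sketch of that solve in plain Python, assuming four non-degenerate point pairs (this is the standard textbook construction, not Nuke's actual code):

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def corner_pin(src, dst):
    """Homography (9 coefficients, h8 fixed to 1) mapping 4 src corners to 4 dst corners."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    return solve_linear(A, b) + [1.0]

def warp(h, x, y):
    """Map a point through the homography (perspective divide included)."""
    w = h[6] * x + h[7] * y + h[8]
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)
```

With the source corners of a unit square and destination corners shifted by (2, 1), every point of the screen, not just the corners, lands in the right place, which is why one CornerPin2D per frame is enough to lock the insert to the phone.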
Finally, I rotoscoped each finger, adding a bit of feather to the shapes so that no green edges show through each time the fingers slide over the screen.
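Feathering just replaces the roto shape's hard alpha edge with a soft ramp across a chosen width, so the finger blends into the replaced screen instead of cutting out with a visible green fringe. A minimal 1D sketch of the idea (a linear ramp; the function is illustrative, not Nuke's actual falloff curve):

```python
def feathered_alpha(distance, feather):
    """Alpha at a signed distance from the roto edge (distance > 0 is inside).

    With feather <= 0 the edge is a hard cut; otherwise alpha ramps
    linearly from 0 to 1 across a band of width `feather` centred on the edge.
    """
    if feather <= 0:
        return 1.0 if distance >= 0 else 0.0
    t = distance / feather + 0.5  # map [-feather/2, +feather/2] onto [0, 1]
    return min(1.0, max(0.0, t))
```

So a pixel exactly on the edge gets alpha 0.5, and pixels well inside or well outside the shape get 1 or 0, which is the gradual transition that hides the mask edge over the moving screen.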