
Attention mechanisms

Attention Is All You Need ... but the scarcity of attention is what drives the Internet economy

Intro to Attention

https://machinelearningmastery.com/how-does-attention-work-in-encoder-decoder-recurrent-neural-networks/

The basic idea is to read the input sequence twice: once to encode its gist into a fixed representation, and again (at each decoding step) to "pay attention" to the specific details most relevant to the output being produced.
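As a rough illustration (not taken from the linked article), below is a minimal NumPy sketch of one decoding step using dot-product attention: the encoder hidden states hold the "details", the current decoder state acts as the query, and softmax weights decide where to pay attention. The function name attention_step and the toy shapes are hypothetical choices for this sketch.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention_step(decoder_state, encoder_states):
    """One decoding step of dot-product attention.

    decoder_state:  (hidden_dim,)         current decoder hidden state (the query)
    encoder_states: (src_len, hidden_dim) all encoder hidden states (the details)

    Returns the context vector (weighted sum of encoder states) and the
    attention weights showing how much each source position was attended to.
    """
    scores = encoder_states @ decoder_state   # (src_len,) alignment scores
    weights = softmax(scores)                 # (src_len,) normalized attention weights
    context = weights @ encoder_states        # (hidden_dim,) weighted summary of the input
    return context, weights

# Toy example: 5 source positions, hidden size 4.
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 4))
dec = rng.normal(size=(4,))
context, weights = attention_step(dec, enc)
print("attention weights:", np.round(weights, 3))  # non-negative, sums to 1
print("context vector:   ", np.round(context, 3))
```

In a full encoder-decoder RNN the context vector is concatenated with the decoder state before predicting the next token, and the weights are recomputed at every decoding step, so each output word can focus on different parts of the input.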

Attention mechanism