This is such a circular exercise that I find it amazing to see.
The reason LLMs use a NN at all is that they're trying to encode a probability function for generating the passage, token by token.
And now you're encoding another n-gram-follower exercise (e.g. 1+1 = 2) on top of it :)
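For anyone who hasn't seen it spelled out, here's a minimal sketch of what an n-gram "follower" boils down to (toy corpus and names are illustrative, not from the thread): raw bigram counts normalized into the same kind of conditional next-token distribution the NN is learning.

```python
from collections import Counter, defaultdict

# Toy whitespace-tokenized corpus: the model can only ever
# "know" transitions that literally co-occur here.
corpus = "one plus one equals two . one plus two equals three .".split()

# Count bigram transitions: how often does `nxt` follow `prev`?
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def next_token_probs(prev):
    """P(next | prev) estimated from raw bigram counts."""
    total = sum(counts[prev].values())
    return {tok: c / total for tok, c in counts[prev].items()}

print(next_token_probs("equals"))  # {'two': 0.5, 'three': 0.5}
```

The circularity is exactly that: the NN was brought in to encode P(next | context) better than tables like this one, and now a count table like this one is being rebuilt on top of its output.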