[D] How can I encode positional information of sentences in a text in a BERT model?
I am trying to train a BERT model for a certain form of text classification, and I realized it might be useful for the model to know whether a sentence continues on the same line or starts on a new line in a PDF document. Is there any way to encode this newline distance in BERT? My idea was to use a customized positional encoding, but I am not sure whether that is the correct approach, and if it is, what continuous function to use. I would love to hear any suggestions on this.
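To make the idea concrete, here is a rough sketch of what I had in mind: an extra embedding, indexed by each token's line number in the PDF, added onto BERT's input embeddings before the encoder. The `LineDistanceEncoding` module and the sinusoidal choice are just my hypothetical starting point (reusing the original transformer positional-encoding formula), not something I know to be correct:

```python
import math
import torch
import torch.nn as nn

class LineDistanceEncoding(nn.Module):
    """Hypothetical sketch: sinusoidal encoding of each token's line index,
    added to the token embeddings alongside BERT's usual position embeddings."""

    def __init__(self, hidden_size: int, max_lines: int = 512):
        super().__init__()
        # Precompute a sinusoidal table over possible line indices,
        # using the same formula as the original transformer encoding.
        pe = torch.zeros(max_lines, hidden_size)
        pos = torch.arange(max_lines, dtype=torch.float).unsqueeze(1)
        div = torch.exp(torch.arange(0, hidden_size, 2).float()
                        * (-math.log(10000.0) / hidden_size))
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, token_embeddings, line_ids):
        # line_ids: (batch, seq_len) tensor giving the PDF line index
        # of each token; tokens on the same line share an index.
        return token_embeddings + self.pe[line_ids]
```

The resulting embeddings could then be fed to the encoder via something like the `inputs_embeds` argument, bypassing the default embedding lookup. A learned `nn.Embedding` over line indices would be an alternative to the fixed sinusoidal table; I am unsure which is more appropriate here.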