PathologyBERT -- Pre-trained Vs. A New Transformer Language Model for Pathology Domain
MLA
Santos, Thiago, et al. "PathologyBERT -- Pre-Trained vs. a New Transformer Language Model for Pathology Domain." arXiv, 2022, arxiv.org/abs/2205.06885.
APA
Santos, T., Tariq, A., Das, S., Vayalpati, K., Smith, G. H., Trivedi, H., & Banerjee, I. (2022). PathologyBERT -- Pre-trained vs. a new transformer language model for pathology domain. arXiv. https://arxiv.org/abs/2205.06885
Chicago
Santos, Thiago, Amara Tariq, Susmita Das, Kavyasree Vayalpati, Geoffrey H. Smith, Hari Trivedi, and Imon Banerjee. 2022. “PathologyBERT -- Pre-Trained vs. a New Transformer Language Model for Pathology Domain.” Preprint, arXiv. https://arxiv.org/abs/2205.06885.