nielsr/lilt-xlm-roberta-base


LiLT + XLM-RoBERTa-base

This model was created by combining the Language-Independent Layout Transformer (LiLT) with XLM-RoBERTa base, a multilingual RoBERTa model trained on 100 languages.
The result is a LayoutLM-like document-understanding model that works across those 100 languages.
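A minimal usage sketch with the Hugging Face `transformers` library (which includes LiLT support), assuming the checkpoint is available on the Hub under this repo id. The example words and bounding boxes are made up; in practice they would come from an OCR engine, with boxes normalized to a 0-1000 coordinate space. LiLT expects one box per token, so each word-level box is repeated for every subword token:

```python
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "nielsr/lilt-xlm-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Hypothetical OCR output: one box per word, normalized
# to 0-1000 coordinates as [x0, y0, x1, y1].
words = ["Hello", "world"]
word_boxes = [[37, 52, 119, 70], [125, 52, 201, 70]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")

# Expand word-level boxes to token level: repeat each word's box
# once per subword token, with a zero box for <s> and </s>.
token_boxes = [[0, 0, 0, 0]]  # <s>
for word, box in zip(words, word_boxes):
    token_boxes.extend([box] * len(tokenizer.tokenize(word)))
token_boxes.append([0, 0, 0, 0])  # </s>

outputs = model(**encoding, bbox=torch.tensor([token_boxes]))
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```

The final hidden states can then feed a token-classification or QA head, e.g. for multilingual form understanding.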

Copyright notice: Original article on this site, published by 微草录 on 2024-01-02, 217 characters in total.
Reposting: Unless otherwise stated, articles on this site are released under the CC 4.0 license; please credit the source when reposting.