Hugging Face/NLP Course

HF-NLP-USING 🤗 TRANSFORMERS : Basic usage completed!

솔웅 2023. 12. 25. 12:42

https://huggingface.co/learn/nlp-course/chapter2/7?fw=pt

 


Basic usage completed!

 

Great job following the course up to here! To recap, in this chapter you:

 


 

  • Learned the basic building blocks of a Transformer model.
  • Learned what makes up a tokenization pipeline.
  • Saw how to use a Transformer model in practice.
  • Learned how to leverage a tokenizer to convert text to tensors that are understandable by the model.
  • Set up a tokenizer and a model together to get from text to predictions.
  • Learned the limitations of input IDs, and learned about attention masks.
  • Played around with versatile and configurable tokenizer methods.

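The steps above can be sketched end to end in a few lines. This is a minimal example, not the course's exact listing: it assumes the sentiment-analysis checkpoint used elsewhere in the course (`distilbert-base-uncased-finetuned-sst-2-english`), but any sequence-classification checkpoint would work the same way.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed checkpoint; swap in any sequence-classification model you like.
checkpoint = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

sequences = [
    "I've been waiting for a HuggingFace course my whole life.",
    "I hate this so much!",
]

# padding=True pads the shorter sequence to the longest one and returns an
# attention mask so the model ignores the padding tokens; truncation=True
# guards against inputs longer than the model's maximum length.
inputs = tokenizer(sequences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Turn raw logits into probabilities over the two labels.
predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
print(predictions)
```

Note that the tokenizer produces both `input_ids` and `attention_mask` tensors of the same shape, which is exactly the padding/attention-mask interplay recapped above.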
 

 

From now on, you should be able to freely navigate the 🤗 Transformers docs: the vocabulary will sound familiar, and you’ve already seen the methods that you’ll use the majority of the time.

 
