Monologg Blog
Home
Archive
About
GitHub
Jangwon Park
AI Engineer
Pretraining ELECTRA with a TPU
2020-04-20
NLP / nlp / tpu / electra
How to pretrain an ELECTRA model using a TPU on GCP, covering the full process from applying to TFRC through VM setup, TPU connection, and running pretraining.
425 words | 2 minutes