크크루쿠쿠
Paper link: https://arxiv.org/abs/1905.11946

EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks

> Convolutional Neural Networks (ConvNets) are commonly developed at a fixed resource budget, and then scaled up for better accuracy if more resources are available. In this paper, we systematically study model scaling and identify that carefully balancing n..

Abstract — CNNs are fix..
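The preview above cuts off mid-sentence, but the idea it introduces — carefully balancing network depth, width, and input resolution — is EfficientNet's compound scaling. A minimal Python sketch of that rule (the constants are the ones reported in the paper; the function name is mine, not the authors'):

```python
# A minimal sketch (not the authors' code) of EfficientNet's compound
# scaling rule: depth, width and input resolution are all scaled by a
# single coefficient phi, using base constants the paper found by grid
# search under the constraint alpha * beta^2 * gamma^2 ≈ 2, so raising
# phi by 1 roughly doubles FLOPS.
ALPHA, BETA, GAMMA = 1.2, 1.1, 1.15  # depth, width, resolution bases

def compound_scale(phi: float) -> tuple[float, float, float]:
    """Return (depth, width, resolution) multipliers for coefficient phi."""
    return ALPHA ** phi, BETA ** phi, GAMMA ** phi

# Example: phi = 1 gives the multipliers for the first scaled-up model.
d, w, r = compound_scale(1.0)
print(f"depth x{d:.2f}, width x{w:.2f}, resolution x{r:.2f}")
```

The point of the single coefficient is that, instead of hand-tuning three knobs per model size, one number controls how much extra compute the network is given.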
AI / Paper Review
2021-08-10 01:48