Label smoothing: introduction and code implementation
2022-07-22 11:16:00 【Trouble coder】
What is label smoothing?
Label smoothing, like L1/L2 regularization and dropout, is a regularization technique in machine learning. It is usually applied to classification problems, with the goal of preventing the model from making overconfident predictions during training, thereby improving poor generalization.
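Concretely, label smoothing replaces the hard one-hot target with a softened distribution: the true class keeps probability 1 - ε and the remaining ε is spread uniformly, so every class (including the correct one) receives ε / K. Below is a minimal sketch of this transformation, with made-up values for illustration (K = 4 classes, true class index 2, ε = 0.1):

    import torch

    # Hypothetical values for illustration: K classes, smoothing factor eps
    K, eps, target = 4, 0.1, 2

    one_hot = torch.zeros(K)
    one_hot[target] = 1.0

    # Softened target: (1 - eps) on the true class, plus a uniform eps / K
    # added to every class
    smoothed = (1.0 - eps) * one_hot + eps / K
    print(smoothed)  # tensor([0.0250, 0.0250, 0.9250, 0.0250])

Training against this softened target is what the loss class below computes in closed form, without materializing the smoothed vector.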
Why use label smoothing?
label smoothing is commonly used in classification tasks to keep the model from overfitting during training and to improve its generalization ability.
Using label smoothing
import torch
import torch.nn as nn
import torch.nn.functional as F

class LabelSmoothingCrossEntropy(nn.Module):
    """Cross-entropy loss with label smoothing."""

    def __init__(self, smoothing=0.1):
        """
        :param smoothing: label smoothing factor
        """
        super(LabelSmoothingCrossEntropy, self).__init__()
        assert 0.0 < smoothing < 1.0
        self.smoothing = smoothing
        self.confidence = 1. - smoothing

    def forward(self, x, target):
        # Way 1 (commented out). Using F.log_softmax is numerically more
        # stable than computing softmax first and taking the log afterwards.
        # logprobs = F.log_softmax(x, dim=-1)
        # nll_loss = -logprobs.gather(dim=-1, index=target.unsqueeze(1))
        # nll_loss = nll_loss.squeeze(1)  # the plain cross-entropy loss
        # # Read this together with the formula: note that the correct class
        # # also receives a/K, where a is the smoothing factor and K is the
        # # number of classes.
        # smooth_loss = -logprobs.mean(dim=1)
        # loss = self.confidence * nll_loss + self.smoothing * smooth_loss

        # Way 2
        y_hat = torch.softmax(x, dim=1)
        # cross_loss here is equivalent to nll_loss above
        cross_loss = self.cross_entropy(y_hat, target)
        smooth_loss = -torch.log(y_hat).mean(dim=1)
        # smooth_loss can also be computed as below;
        # note that log(a) + log(b) = log(ab)
        # smooth_loss = -torch.log(torch.prod(y_hat, dim=1)) / y_hat.shape[1]
        loss = self.confidence * cross_loss + self.smoothing * smooth_loss
        return loss.mean()

    def cross_entropy(self, y_hat, y):
        return -torch.log(y_hat[range(len(y_hat)), y])
If the loss function you started with was:
lossfunc = nn.CrossEntropyLoss()
you only need to change it to:
lossfunc = LabelSmoothingCrossEntropy(smoothing=0.1)
and your code is using label smoothing.
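As a sanity check, recent PyTorch versions (1.10 and later) also build label smoothing directly into nn.CrossEntropyLoss via its label_smoothing argument, and the custom class above should agree with it up to floating-point error. A quick comparison sketch, with made-up batch and class sizes:

    # Hypothetical shapes: a batch of 8 samples over 5 classes
    logits = torch.randn(8, 5)
    targets = torch.randint(0, 5, (8,))

    custom = LabelSmoothingCrossEntropy(smoothing=0.1)
    builtin = nn.CrossEntropyLoss(label_smoothing=0.1)

    # The two values should match up to numerical precision
    print(custom(logits, targets).item(), builtin(logits, targets).item())

If you are on PyTorch 1.10+, the built-in argument is the simpler choice; the custom class remains useful on older versions or as a starting point for variants.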