The Annotated Transformer and its PyTorch Implementation

Original: http://nlp.seas.harvard.edu/2018/04/03/attention.html
Author: Alexander Rush
Reposted from Jiqizhixin (机器之心): https://www.jiqizhixin.com/articles/2018-11-06-10?from=synced&keyword=transformer

While working through the post, I tidied up the code and layout to make it easier to read.

"Attention is All You Need"
# coding:utf-8
import numpy as np
import matplotlib.pyplot as plt


def dis(x, y):  # squared Euclidean distance between two points
    return np.sum(np.power(y - x, 2))


def dataN(length, k):  # generate sample data: k cluster centers evenly spaced on a circle
    z = range(k)
    c = [5] * length
    a1 = [np.sin(i * 2 * np.pi / k) for i in range(k)]
    a2 = [np.cos(i * 2 * np.pi / k) for i in range(k)]
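    # Assumed continuation (not in the original snippet, which breaks off
    # above): a minimal sketch that scatters `length` points around the k
    # circle centers. The radius scaling via `c` and the noise scale 0.3
    # are assumptions, as are the cyclic labels and the return signature.
    labels = [z[i % k] for i in range(length)]  # assumed: assign points to clusters cyclically
    xs = [c[i] * a1[labels[i]] + np.random.normal(0, 0.3) for i in range(length)]
    ys = [c[i] * a2[labels[i]] + np.random.normal(0, 0.3) for i in range(length)]
    return np.array(list(zip(xs, ys))), np.array(labels)


# Example usage under the assumed signature: 300 points in 3 clusters.
# data, labels = dataN(300, 3)
# plt.scatter(data[:, 0], data[:, 1], c=labels)
# plt.show()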