Recommender Systems
Take fixing $Q$ and solving for $P$ as an example. Since each user $u$ is independent of the others, when $Q$ is fixed the value of the user feature vector $P_u$ does not depend on any other user's feature vector, so each $P_u$ can be solved for separately. The objective $\min_{P,Q}C$ therefore decomposes:

$$\min_P\Big[\sum_{u,i}(R_{ui}-P_u\cdot Q_i)^2+\lambda\sum_{u}\lVert P_u\rVert^2\Big]=\sum_{u}\min_{P_u}\Big[\sum_{i}(R_{ui}-P_u\cdot Q_i)^2+\lambda\lVert P_u\rVert^2\Big]$$

Let

$$L_u(P_u)=\sum_{i}(R_{ui}-P_u\cdot Q_i)^2+\lambda\lVert P_u\rVert^2$$

Our goal then becomes: for each user, find the feature vector $P_u$ that minimizes $L_u(P_u)$. Taking the partial derivative of $L_u$ with respect to $P_u$:

$$\frac{\partial L_u}{\partial P_u}=\frac{\partial\big[\sum_{i}(R_{ui}-P_u\cdot Q_i)^2+\lambda\lVert P_u\rVert^2\big]}{\partial P_u}=\sum_{i}2(P_uQ_i-R_{ui})Q_i+2\lambda P_u^T=2\Big(\sum_{i}P_uQ_iQ_i-\sum_{i}R_{ui}Q_i+\lambda P_u^T\Big)$$

Setting $\frac{\partial L_u}{\partial P_u}=0$ gives

$$\sum_{i}P_uQ_iQ_i-\sum_{i}R_{ui}Q_i+\lambda P_u^T=0$$

Since $P_u$ and $Q_i$ are vectors,

$$P_uQ_i=\sum_{k}P_{uk}Q_{ki}=Q_i^TP_u^T$$

so we have

$$\sum_{i}Q_i^TP_u^TQ_i-\sum_{i}R_{ui}Q_i+\lambda P_u^T=0$$

$$\Big(\sum_{i}Q_iQ_i^T+\lambda I\Big)P_u^T=\sum_{i}R_{ui}Q_i$$
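This normal equation can be sanity-checked numerically with NumPy. The toy dimensions, random $Q$, ratings $R_u$, and $\lambda$ below are all illustrative assumptions; the check confirms that solving the linear system makes the gradient of $L_u$ vanish:

```python
import numpy as np

# Toy problem (all sizes and values are illustrative assumptions):
rng = np.random.default_rng(0)
k, n, lam = 3, 5, 0.1                  # latent dim, item count, lambda
Q = rng.normal(size=(k, n))            # column i is the item vector Q_i
R_u = rng.normal(size=n)               # user u's ratings R_{u1}, ..., R_{un}

# Left-hand side: sum_i Q_i Q_i^T + lambda * I
A = sum(np.outer(Q[:, i], Q[:, i]) for i in range(n)) + lam * np.eye(k)
# Right-hand side: sum_i R_{ui} Q_i
b = sum(R_u[i] * Q[:, i] for i in range(n))

P_u = np.linalg.solve(A, b)            # P_u^T solving the normal equation

# The gradient of L_u must vanish at this P_u:
grad = 2 * (A @ P_u - b)
print(np.allclose(grad, 0))           # True
```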
In this normal equation, $I$ is the identity matrix and $Q_i$ is the feature vector of item $i$, so it can be rewritten as:

$$\big[(Q_1,Q_2,\cdots,Q_n)(Q_1^T,Q_2^T,\cdots,Q_n^T)^T+\lambda I\big]P_u^T=(Q_1,Q_2,\cdots,Q_n)(R_{u1},R_{u2},\cdots,R_{un})^T$$

Notice that the matrix formed by the item feature vectors is exactly $Q$:

$$(Q_1,Q_2,\cdots,Q_n)=Q$$

So we obtain

$$(QQ^T+\lambda I)P_u^T=QR_u$$

which solves to

$$P_u^T=(QQ^T+\lambda I)^{-1}QR_u$$

Solving for each $P_u$ in turn yields the user feature matrix $P$ for this step. Likewise, in the next step, with $P$ fixed we can solve

$$Q_i^T=(PP^T+\lambda I)^{-1}PR_i$$
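Putting the two closed-form updates together, the alternating loop can be sketched in NumPy. The toy rating matrix, rank $k$, $\lambda$, and iteration count are assumptions for illustration; the sketch also treats $R$ as fully observed, whereas a real ALS implementation sums only over observed ratings:

```python
import numpy as np

def als(R, k=2, lam=0.1, n_iters=50, seed=0):
    """Minimal ALS sketch for R ~ P^T Q. Assumes a fully observed R;
    a production version would restrict each sum to observed entries."""
    m, n = R.shape
    rng = np.random.default_rng(seed)
    Q = rng.normal(scale=0.1, size=(k, n))   # column i is Q_i
    I = np.eye(k)
    for _ in range(n_iters):
        # Fix Q: P_u^T = (Q Q^T + lam I)^{-1} Q R_u, all users at once
        P = np.linalg.solve(Q @ Q.T + lam * I, Q @ R.T)
        # Fix P: Q_i^T = (P P^T + lam I)^{-1} P R_i, all items at once
        Q = np.linalg.solve(P @ P.T + lam * I, P @ R)
    return P, Q

R = np.array([[5., 4., 1.],
              [4., 5., 1.],
              [1., 1., 5.]])
P, Q = als(R)
print(np.round(P.T @ Q, 1))   # low-rank reconstruction of R
```

Because each update solves one user's (or item's) regularized least-squares problem exactly, the objective $C$ is non-increasing across iterations, which is why alternating between the two closed forms converges.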