Should a neural network with no hidden layers and a linear activation function approximate linear regression?

Question · Votes: 0 · Answers: 1

As I understand it, a neural network with no hidden layers and a linear activation function should produce the same equation as linear regression, namely y = SUM(w_i * x_i) + b, where i ranges over the number of features you have.
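That formula can be sketched in NumPy (with hypothetical toy numbers), showing that the element-wise sum and the matrix product a Dense(1) layer computes are the same thing:

```python
import numpy as np

# Hypothetical toy data: 3 samples, 2 features
X = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
w = np.array([0.5, -0.25])  # one weight per feature
b = 0.1                     # single shared intercept

# Element-wise form: y_j = sum_i w_i * x_ji + b
y_sum = np.array([sum(w[i] * X[j, i] for i in range(X.shape[1])) + b
                  for j in range(X.shape[0])])

# Matrix form: y = X @ w + b, which is what a Dense(1, activation='linear')
# layer with kernel w and bias b computes
y_mat = X @ w + b

print(np.allclose(y_sum, y_mat))  # True
```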

I tried to prove this to myself by taking the weights and bias from a fitted linear regression, loading them into a neural network, and checking whether the predictions match. They don't.

I'd like to know whether my understanding is wrong, my code is wrong, or both.


from sklearn.linear_model import LinearRegression
import tensorflow as tf
from tensorflow import keras
import numpy as np

linearModel = LinearRegression()
linearModel.fit(np.array(normTrainFeaturesDf), np.array(trainLabelsDf))

# Get the weights and intercept of the linear model in a form that can be
# passed into the neural network: the Dense kernel expects shape
# (n_features, 1) and the bias expects shape (1,)
linearWeights = np.array(linearModel.coef_)
intercept = np.array([linearModel.intercept_])

trialWeights = np.reshape(linearWeights, (len(linearWeights), 1))
trialWeights = trialWeights.astype('float32')
intercept = intercept.astype('float32')
newTrialWeights = [trialWeights, intercept]

# Create a neural network and set the weights of the model to the linear model
nnModel = keras.Sequential([keras.layers.Dense(1, activation='linear', input_shape=[len(normTrainFeaturesDf.keys())]),])
nnModel.set_weights(newTrialWeights)

# Print predictions of both models
print(linearModel.predict(np.array(normTestFeaturesDf)))
print(nnModel.predict(normTestFeaturesDf).flatten())

python tensorflow machine-learning neural-network linear-regression
1 Answer

0 votes

Yes, a neural network with a single layer and no activation function is equivalent to linear regression.

Defining some variables you didn't include:

normTrainFeaturesDf = np.random.rand(100, 10)
normTestFeaturesDf = np.random.rand(10, 10)
trainLabelsDf = np.random.rand(100)

the output is then as expected (the two sets of predictions agree up to float32 rounding):

[0.46698921 0.73922914 0.48939089 0.6061812  0.6043783  0.45431294
 0.37186223 0.72054087 0.47443313 0.66467065]
[0.46698922 0.73922914 0.4893909  0.6061812  0.6043783  0.45431295
 0.37186223 0.7205409  0.47443312 0.66467065]
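The tiny last-digit differences come from Keras computing in float32 while scikit-learn works in float64. That effect can be reproduced with NumPy alone (a sketch with hypothetical random data, using `np.linalg.lstsq` in place of `LinearRegression`):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((100, 10))
y = rng.random(100)

# Ordinary least squares with an appended intercept column,
# equivalent to LinearRegression(fit_intercept=True)
Xb = np.hstack([X, np.ones((100, 1))])
coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)

Xtest = np.hstack([rng.random((10, 10)), np.ones((10, 1))])

# float64 predictions, as scikit-learn would produce them
pred64 = Xtest @ coef

# The same weights applied in float32, as a Keras Dense layer would
pred32 = Xtest.astype('float32') @ coef.astype('float32')

# Not bit-identical, but equal to within float32 precision
print(np.allclose(pred64, pred32, atol=1e-5))
```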