
How do I run predictions with a neural network after configuring and training it in MATLAB?

Asked by a Stack Overflow user on 2020-11-04 06:17:09
1 answer · 190 views · 0 following · 0 votes

I used the nnstart command and got a MATLAB app for configuring and training a network. Once I had imported my data and trained the network, the app gave me no option to actually run a time-series prediction. The best I could do was generate a script, but the script does not seem to include a step that actually makes a prediction. Here is the code. How do I run a prediction? Also, how do I choose the activation function g(x)?

Code language: MATLAB
% Solve an Input-Output Time-Series Problem with a Time Delay Neural Network
% Script generated by Neural Time Series app.
% Created 03-Nov-2020 23:33:27
%
% This script assumes these variables are defined:
%
%   data - input time series.
%   data_1 - target time series.

X = tonndata(data,false,false);
T = tonndata(data_1,false,false);

% Choose a Training Function
% For a list of all training functions type: help nntrain
% 'trainlm' is usually fastest.
% 'trainbr' takes longer but may be better for challenging problems.
% 'trainscg' uses less memory. Suitable in low memory situations.
trainFcn = 'trainlm';  % Levenberg-Marquardt backpropagation.

% Create a Time Delay Network
inputDelays = 1:2;
hiddenLayerSize = 10;
net = timedelaynet(inputDelays,hiddenLayerSize,trainFcn);

% Choose Input and Output Pre/Post-Processing Functions
% For a list of all processing functions type: help nnprocess
net.input.processFcns = {'removeconstantrows','mapminmax'};
net.output.processFcns = {'removeconstantrows','mapminmax'};

% Prepare the Data for Training and Simulation
% The function PREPARETS prepares timeseries data for a particular network,
% shifting time by the minimum amount to fill input states and layer
% states. Using PREPARETS allows you to keep your original time series data
% unchanged, while easily customizing it for networks with differing
% numbers of delays, with open loop or closed loop feedback modes.
[x,xi,ai,t] = preparets(net,X,T);

% Setup Division of Data for Training, Validation, Testing
% For a list of all data division functions type: help nndivision
net.divideFcn = 'dividerand';  % Divide data randomly
net.divideMode = 'time';  % Divide up every sample
net.divideParam.trainRatio = 70/100;
net.divideParam.valRatio = 15/100;
net.divideParam.testRatio = 15/100;

% Choose a Performance Function
% For a list of all performance functions type: help nnperformance
net.performFcn = 'mse';  % Mean Squared Error

% Choose Plot Functions
% For a list of all plot functions type: help nnplot
net.plotFcns = {'plotperform','plottrainstate','ploterrhist', ...
    'plotregression','plotresponse','ploterrcorr','plotinerrcorr'};

% Train the Network
[net,tr] = train(net,x,t,xi,ai);

% Test the Network
y = net(x,xi,ai);
e = gsubtract(t,y);
performance = perform(net,t,y)

% Recalculate Training, Validation and Test Performance
trainTargets = gmultiply(t,tr.trainMask);
valTargets = gmultiply(t,tr.valMask);
testTargets = gmultiply(t,tr.testMask);
trainPerformance = perform(net,trainTargets,y)
valPerformance = perform(net,valTargets,y)
testPerformance = perform(net,testTargets,y)

% View the Network
view(net)

% Plots
% Uncomment these lines to enable various plots.
%figure, plotperform(tr)
%figure, plottrainstate(tr)
%figure, ploterrhist(e)
%figure, plotregression(t,y)
%figure, plotresponse(t,y)
%figure, ploterrcorr(e)
%figure, plotinerrcorr(x,e)

% Step-Ahead Prediction Network
% For some applications it helps to get the prediction a timestep early.
% The original network returns predicted y(t+1) at the same time it is
% given x(t+1). For some applications such as decision making, it would
% help to have predicted y(t+1) once x(t) is available, but before the
% actual y(t+1) occurs. The network can be made to return its output a
% timestep early by removing one delay so that its minimal tap delay is now
% 0 instead of 1. The new network returns the same outputs as the original
% network, but outputs are shifted left one timestep.
nets = removedelay(net);
nets.name = [net.name ' - Predict One Step Ahead'];
view(nets)
[xs,xis,ais,ts] = preparets(nets,X,T);
ys = nets(xs,xis,ais);
stepAheadPerformance = perform(nets,ts,ys)

% Deployment
% Change the (false) values to (true) to enable the following code blocks.
% See the help for each generation function for more information.
if (false)
    % Generate MATLAB function for neural network for application
    % deployment in MATLAB scripts or with MATLAB Compiler and Builder
    % tools, or simply to examine the calculations your trained neural
    % network performs.
    genFunction(net,'myNeuralNetworkFunction');
    y = myNeuralNetworkFunction(x,xi,ai);
end
if (false)
    % Generate a matrix-only MATLAB function for neural network code
    % generation with MATLAB Coder tools.
    genFunction(net,'myNeuralNetworkFunction','MatrixOnly','yes');
    x1 = cell2mat(x(1,:));
    xi1 = cell2mat(xi(1,:));
    y = myNeuralNetworkFunction(x1,xi1);
end
if (false)
    % Generate a Simulink diagram for simulation or deployment with
    % Simulink Coder tools.
    gensim(net);
end
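For context: the prediction step is actually already in the generated script, in the line `y = net(x,xi,ai)` — calling the trained network object IS the prediction. To apply the network to data it has not seen, the same call works with new inputs. A minimal sketch follows; `newData` is a hypothetical variable (not part of the generated script) that must have the same row layout as the original training input `data`:

```matlab
% Sketch: run the trained time delay network on a new input series.
% 'newData' is a hypothetical new time series with the same number of
% rows (features) as the training input 'data'.
Xnew = tonndata(newData,false,false);

numDelays = 2;                   % matches inputDelays = 1:2 above
Xi = Xnew(1:numDelays);          % first timesteps fill the input delay states
Xs = Xnew(numDelays+1:end);      % remaining timesteps are the actual inputs

yNew = net(Xs,Xi);               % predicted outputs, as a cell array
yNewMat = cell2mat(yNew);        % convert back to an ordinary matrix

% Regarding the activation function g(x): in this toolbox it is the layer
% transfer function, and it can be set before training, e.g.:
% net.layers{1}.transferFcn = 'logsig';   % hidden-layer default is 'tansig'
```

The manual splitting into `Xi`/`Xs` mirrors what `preparets` does during training: the first `numDelays` timesteps only prime the tap delay line and produce no output.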

1 Answer

Answered by a Stack Overflow user on 2020-11-04 16:03:53

For classification models, use predict on your model object: Y = predict(Mdl,X)

For regression models, use sim on the model object: Y = sim(Mdl,X)

Unlike some other languages, MATLAB does not wrap every method into a class; instead there is one command that works across all models (actually two: one for classification and one for continuous prediction). You can therefore also use them on an SVM (fitcsvm/fitrsvm) or kNN (fitcknn).
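As an illustration of the pattern this answer describes, here is a short sketch using predict on a classification model. The fisheriris sample data set and the specific observation are assumptions for illustration, not part of the question:

```matlab
% Sketch: predict with a Statistics and Machine Learning Toolbox model object.
load fisheriris                          % built-in sample data: meas, species
Mdl = fitcknn(meas,species);             % train a kNN classifier
label = predict(Mdl,[5.1 3.5 1.4 0.2]);  % classify one new observation
```

Note that the network generated by nnstart in the question is a Deep Learning Toolbox network object, not a fitted model object; for it, the prediction is made by calling the network directly, y = net(x,xi,ai), as the generated script does, or equivalently via sim(net,x,xi,ai).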

Votes: 0
The original content of this page was provided by Stack Overflow; translation was supported by Tencent Cloud's machine-translation engine for the IT domain.
Original link:

https://stackoverflow.com/questions/64671518
