
Commit

snapshot
Michael Creel committed May 3, 2024
1 parent 9cdd4fb commit b3cb738
Showing 6 changed files with 225 additions and 67 deletions.
4 changes: 3 additions & 1 deletion Examples/GMM/TwoStepGMM.m
@@ -1,3 +1,4 @@
%%
NerloveData = importdata('NerloveData.m');
n = size(NerloveData,1);
y = log(NerloveData(:,2));
@@ -8,6 +9,7 @@
%% this block shows that start values can be important, even for this simple problem
%% in the past, Matlab would not get the correct answer here, though Octave does. I'm
%% not sure of the current behavior of Matlab
%% Note April 2024: now Octave has copied Matlab's incorrect result! Bug for bug compatibility!
W = eye(k);
thetastart = zeros(k,1);
options = optimset('TolX',1e-15,'TolFun',1e-12);
@@ -33,7 +35,7 @@
omega = ms'*ms/n;
fprintf('estimated covariance of moments\n');
omega
W = inv(omega);
thetastart = x\y; % this gives the OLS coefficients
[thetahat, obj] = fminunc(@(theta) moments(theta, y, x)'*W*moments(theta, y, x), thetastart);
fprintf('GMM results, optimal weight matrix\n');
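For readers following the two-step logic above outside of Octave/Matlab, here is a minimal Julia sketch of the same procedure: a first step with an identity weight matrix, an estimate of the moment covariance from the first-step contributions, and a second, efficient step. It is only an illustration under assumptions: y and x are already in memory, Optim.jl is used for the minimization, and the names moments_ols, contributions and twostep_gmm are not part of this repository.

using Optim, LinearAlgebra

# average moment conditions and per-observation contributions, as in the moments.m file below
moments_ols(θ, y, x) = x' * (y .- x * θ) / size(y, 1)
contributions(θ, y, x) = x .* (y .- x * θ)

function twostep_gmm(y, x)
    n, k = size(x)
    obj(θ, W) = (m = moments_ols(θ, y, x); dot(m, W * m))
    # first step: identity weight matrix; OLS coefficients are a safe start value
    θ1 = Optim.minimizer(optimize(θ -> obj(θ, I(k)), x \ y, BFGS()))
    # estimate the covariance of the moment contributions, as omega = ms'*ms/n above
    ms = contributions(θ1, y, x)
    Ω = ms' * ms / n
    # second step: efficient weight matrix W = inv(Ω)
    θ2 = Optim.minimizer(optimize(θ -> obj(θ, inv(Ω)), θ1, BFGS()))
    return θ2, Ω
end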
3 changes: 2 additions & 1 deletion Examples/GMM/moments.m
@@ -1,8 +1,9 @@
% moments for OLS estimation
function [m, ms] = moments(theta, y, x)
e = y - x*theta;
n = size(y,1);
k = size(theta,1);
m = x'*e;
m = x'*e/n;
if nargout > 1
ms = x.*repmat(e,1,k);
end
13 changes: 4 additions & 9 deletions PracticalSummaries/16-GMM.jl
@@ -19,7 +19,7 @@ mleresults(model, β⁰); # but won't overflow
## GMM
using Econometrics
moments1 = β -> x.*(y - λ(β))
gmmresults(moments1, β⁰);
βhat, objv, V, D, W, convergence = gmmresults(moments1, β⁰);
# note that the GMM estimator is identical to the ML estimator. You should
# be able to prove that result analytically, by showing that the moment conditions
# are the same as the ML score vector. Because this GMM estimator is
@@ -29,7 +29,7 @@ gmmresults(moments1, β⁰);
## Trying a different GMM estimator, based on E(y)=λ, so E(y/λ)=1.
using Econometrics
moments2 = β -> x.* (y./λ(β) .- 1.0)
gmmresults(moments2, β⁰);
βhat, objv, V, D, W, convergence = gmmresults(moments2, β⁰);
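As a quick numerical check of the claim above that the first set of moment conditions coincides with the ML score, one can differentiate the Poisson log-likelihood and compare it with the summed moment contributions. This is only a sketch under assumptions: it takes λ(β) = exp.(x*β), as in the Poisson model estimated earlier in this script, and reuses x, y, β⁰ and moments1 from above.

using ForwardDiff
loglik = β -> sum(y .* log.(λ(β)) .- λ(β))    # Poisson log-likelihood, dropping the log(y!) constant
score = ForwardDiff.gradient(loglik, β⁰)
m_total = vec(sum(moments1(β⁰), dims = 1))    # Σᵢ xᵢ(yᵢ - λᵢ)
score ≈ m_total    # should be true: the moments are exactly the score contributions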


## Try out overidentified GMM estimator, using both sets of moments
@@ -47,11 +47,6 @@ m = moments3(βhat) # the moment contributions, evaluated at estimate
## let's see how the Hansen-Sargan test can detect
# incorrect moments
η = 0.01 # if this is different from zero, moments4, and thus moments5, will not be valid
moments4 = β -> x.* (y./λ(β) .- 1.0 .+ η)
moments5 = β -> [moments1(β) moments4(β)]
gmmresults(moments5, β⁰);
moments4 = β -> moments3(β) .+ η
βhat, objv, V, D, W, convergence = gmmresults(moments4, β⁰);
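To make the Hansen-Sargan logic explicit: under correct moment conditions, n times the minimized objective with the efficient weight matrix is asymptotically χ² with degrees of freedom equal to the number of moments minus the number of parameters, so the contaminated moments above should produce a large statistic. The sketch below assumes that objv returned by gmmresults is the minimized m̄'W m̄ built from average moments with the efficient W (worth confirming against the Econometrics.jl source); Distributions.jl supplies the χ² cdf.

using Distributions
g = size(moments4(βhat), 2)        # number of moment conditions
k = length(βhat)                   # number of parameters
J = size(x, 1) * objv              # Hansen-Sargan test statistic
pval = 1.0 - cdf(Chisq(g - k), J)  # a small p-value signals invalid moments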

## Yet another version, using conditional mean = conditional variance
using Econometrics, ForwardDiff
moments6 = β -> x.*((y - λ(β)) .^2.0 - λ(β))
βhat, objv, V, D, W, convergence = gmmresults(moments6, β⁰);
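A natural follow-up, in the same spirit as the stacked moments used earlier, is to combine the mean-based and variance-based conditions into one overidentified set and re-estimate; moments7 is just an illustrative name, not defined in this script.

moments7 = β -> [moments1(β) moments6(β)]
βhat, objv, V, D, W, convergence = gmmresults(moments7, β⁰);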