Command and Programming Reference Sample Programs

The cpr subdirectory contains sample program files as described in Chapter 7 of the Command and Programming Reference.

Compute descriptive statistics by year

Computes descriptive statistics by year using the statby view of a series. The program creates a year identifier series and computes the statistics for each value of the identifier. descr1.prg displays the statistics for each year in tables.
' tabulate descriptive statistics by year
' revised for version 4.0 (10/26/2000 h)

' change path to program path
%path = @runpath
cd "{%path}"

' get workfile
%evworkfile = "..\data\basics"
load "{%evworkfile}"

' set sample
smpl 1990:1 @last

' create a series containing the year identifier
series year = @year

' compute statistics for each year and
' freeze the output from each of the tables
for %var ip urate m1 tb10
	%name = "tab" + %var
	freeze(%name){%var}.statby(min,max,mean,med) year
	show {%name}
next
descr1.prg displays the values in frozen tables. You may instead wish to place the annual values in a vector or matrix so that they can be used in other calculations. desc2.prg creates a matrix to hold the statistics for each year and loops over the years, computing the mean, median, minimum, maximum, standard deviation, and number of observations for each.
' store descriptive statistics by year
' revised for version 4.0 (10/26/2000 h)

' change path to program path
%path = @runpath
cd "{%path}"

' get workfile
%evworkfile = "..\data\basics"
load "{%evworkfile}"

' set sample
smpl 1990:1 @last

' create series with the year and find the maximum
series year = @year
!lastyear = @max(year)

' loop over the series names
for %var ip urate m1 tb10

	' create matrix to hold the results
	!numrows = (!lastyear - 1990 + 1) + 1
	matrix(!numrows, 7) mat%var

	' loop over each year and compute values
	for !i = 1990 to !lastyear
			!row = !i - 1990 + 1
			smpl if (year=!i)
			mat%var(!row,1) = !i
			mat%var(!row,2) = @mean({%var})
			mat%var(!row,3) = @med({%var})
			mat%var(!row,4) = @min({%var})
			mat%var(!row,5) = @max({%var})
			mat%var(!row,6) = @stdev({%var}) 
			mat%var(!row,7) = @obs({%var})
	next

	' compute the total values
	smpl 1990:1 @last
	mat%var(!numrows,1) = !i
	mat%var(!numrows,2) = @mean({%var})
	mat%var(!numrows,3) = @med({%var})
	mat%var(!numrows,4) = @min({%var})
	mat%var(!numrows,5) = @max({%var})
	mat%var(!numrows,6) = @stdev({%var})
	mat%var(!numrows,7) = @obs({%var})

	show mat%var
next
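The annual statistics produced by the loop above translate directly to a grouped calculation in other environments. As a rough sketch, the following Python/pandas fragment (hypothetical quarterly data standing in for the BASICS workfile) computes the same six statistics per year and appends a full-sample row, mirroring the layout of mat%var:

```python
import numpy as np
import pandas as pd

# Hypothetical quarterly data in place of the BASICS workfile.
rng = pd.period_range("1990Q1", "1992Q4", freq="Q")
df = pd.DataFrame({"m1": np.arange(len(rng), dtype=float)}, index=rng)

# Year identifier, analogous to `series year = @year`.
df["year"] = df.index.year

# Per-year statistics, matching columns 2-7 of mat%var.
cols = ["mean", "median", "min", "max", "std", "count"]
stats = df.groupby("year")["m1"].agg(cols)

# Append a full-sample row, as the program's final block does.
stats.loc["total"] = df["m1"].agg(cols).values

print(stats)
```

The groupby replaces the explicit `smpl if (year=!i)` loop over years.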

Rolling window unit-root (ADF) tests

rollreg.prg runs a set of rolling ADF tests on the series M1 using a moving sample of fixed size. The basic techniques for working with rolling samples may be used in a variety of settings. The program stores and displays the t-statistics together with the asymptotic critical value.
' rolling ADF unit root test
' run in quiet mode for slight speed up
'
' revised for version 4.0 (10/26/2000 h)

' change path to program path
%path = @runpath
cd "{%path}"

' get workfile
%evworkfile = "..\data\basics"
load "{%evworkfile}"

' set sample
smpl @all

' find size of workfile
series _temp = 1
!length = @obs(_temp)
delete _temp 

' set fixed sample size
!ssize = 50

' initialize matrix to store results
matrix(!length-!ssize+1,2) adftstat

' run test regression for each subsample and
' store each ar(1) coefficient
' test includes constant with no lagged first difference
for !i = 1  to  !length-!ssize+1

	' set rolling subsample
	smpl @first+!i-1 @first+!i+!ssize-2
	
	' estimate test regression
	equation temp.ls d(m1) c m1(-1)

	' store t-statistic
	adftstat(!i,1) = temp.@tstat(2)
	
	' 5% critical value from Davidson and MacKinnon,
	' Table 20.1, page 708 
	adftstat(!i,2) = -2.86
next

' plot graph of t-statistic
freeze(graph1) adftstat.line
' set aspect ratio and use line pattern
graph1.option size(8,2)	linepat
' set legend text
graph1.setelem(1) legend(ADF t-statistic)
graph1.setelem(2) legend(Asymptotic 5% critical value)
' add text at top of graph
%comment = "ADF t-statistic for window of " + @str(!ssize) + " obs for M1"
graph1.addtext(t) {%comment}
show graph1
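The rolling-sample technique is not specific to EViews. A minimal Python/numpy sketch (simulated random-walk data standing in for M1) runs the same test regression, d(m1) on a constant and m1(-1), over each fixed-size window and collects the t-statistic on the lagged level:

```python
import numpy as np

# Simulated random walk in place of M1 (hypothetical data).
gen = np.random.default_rng(0)
y = np.cumsum(gen.normal(size=200))

ssize = 50  # fixed window length, as in the program
tstats = []
for i in range(len(y) - ssize + 1):
    w = y[i:i + ssize]
    dy = np.diff(w)                                    # d(m1)
    X = np.column_stack([np.ones(ssize - 1), w[:-1]])  # c, m1(-1)
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)                 # residual variance
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])    # s.e. of m1(-1) coef
    tstats.append(beta[1] / se)

crit = -2.86  # asymptotic 5% critical value (Davidson and MacKinnon)
print(sum(t < crit for t in tstats), "of", len(tstats), "windows reject")
```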

Calculate cumulative sums

EViews does not have a built-in function for calculating cumulative sums, but you can compute them with a few lines of commands. cum_sum.prg illustrates three ways of handling missing values in the series.
' three rules to compute cumulative sums with missing values
' revised for version 4.0 (10/20/2000 h)

'create workfile
workfile cumsum u 1 10

'fill data
series x
x.fill 4,2,na,4,1,3,2,na,7,5

'for series without NAs
smpl @first @first
series sum0 = x
smpl @first+1 @last
sum0 = sum0(-1) + x

'ignore NAs
smpl @first @first
series sum1 = @nan(x, 0)
smpl @first+1 @last
sum1 = sum1(-1) + @nan(x, 0)
smpl @all if x = na
sum1 = na

'reset to zero at NAs
smpl @all
series sum2 = x
smpl if sum2(-1)<>na and x<>na
sum2 = sum2(-1) + x

smpl @all
show x sum0 sum1 sum2
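For comparison, the three rules can be sketched in Python with pandas (same ten-observation series as the program; names follow sum0/sum1/sum2):

```python
import numpy as np
import pandas as pd

x = pd.Series([4, 2, np.nan, 4, 1, 3, 2, np.nan, 7, 5], dtype=float)

# sum0: plain cumulative sum; once an NA appears it propagates.
sum0 = x.cumsum(skipna=False)

# sum1: ignore NAs in the running total, but keep NA at NA positions.
sum1 = x.fillna(0).cumsum().where(x.notna())

# sum2: restart the sum after each NA.
grp = x.isna().cumsum()            # a new group begins at every NA
sum2 = x.groupby(grp).cumsum()     # NaN stays NaN; sums restart within groups

print(pd.DataFrame({"x": x, "sum0": sum0, "sum1": sum1, "sum2": sum2}))
```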

Compute lags on a broken sample of observations

EViews propagates missing values in series computations, so lag and difference functions evaluated across missing observations produce NAs. subset.prg demonstrates a general technique for working with a subset of observations from the workfile as though they were adjacent. The example computes the first difference of a series with missing values.
' time series operation on subsamples
' revised for version 4.0 (10/20/2000 h)

' create workfile
workfile subset u 1 10

' fill data
series x
x.fill 4,2,na,4,1,3,2,na,7,5

' create sample
sample ss if x<>NA and x>=0

' create short x series
smpl @all
vector temp  
stomna(x, temp, ss)
series x_s
mtos(temp, x_s)
!rows = @rows(temp)  'save number of elements
delete temp

' difference short x series
series dx_s = d(x_s)

' map back into long series dx
vector temp
stomna(dx_s, temp)
vector(!rows) temp  'trim to number of elements
series dx
mtos(temp, dx, ss)
delete temp

' display results
show x dx x_s dx_s
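The pack/operate/unpack pattern (stomna to a vector, d() on the short series, mtos back into the sample) has a direct pandas analogue: drop the NAs, difference the packed values, and scatter the result back to the original positions. A sketch with the same series:

```python
import numpy as np
import pandas as pd

x = pd.Series([4, 2, np.nan, 4, 1, 3, 2, np.nan, 7, 5], dtype=float)

# "Short" series: valid observations packed together (the stomna/mtos step).
x_s = x.dropna().reset_index(drop=True)

# Difference the short series as though its observations were adjacent.
dx_s = x_s.diff()

# Map the result back to the original positions (the sample `ss`).
dx = pd.Series(np.nan, index=x.index)
dx[x.notna()] = dx_s.values

print(pd.DataFrame({"x": x, "dx": dx}))
```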

Create dummy variables with a loop

make_dum.prg creates observation-specific dummy variables using a loop.
' create dummy variables for every observation in sample
' revised for version 4.0 (10/20/2000 h)

'create workfile
workfile dumtest q 1970 1990

'set up start and end dates
%start = "1972:1"	'first observation to dummy
%end = "1979:4"    	'last observation to dummy

'generate dummy variables from 'start' to 'end'
for !i = @dtoo(%start) to @dtoo(%end)
	'string containing observation offset
	%obsstr = @otod(!i)
	'name of dummy
	if (@mid(%obsstr, 5, 1) = ":") then
		%name = "d_" + @left(%obsstr, 4) + "_" + @mid(%obsstr, 6)
	else
		%name = "d_" + %obsstr
	endif
	'generate dummy series
	smpl @all
	series %name = 0
	smpl %obsstr %obsstr
	series %name = 1
next

smpl @all
show d_*
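The same construction can be sketched in Python with pandas (quarterly range and dummy dates taken from the program; the d_YYYY_Q naming mirrors the %name logic above):

```python
import pandas as pd

# Quarterly workfile range, as in `workfile dumtest q 1970 1990`.
idx = pd.period_range("1970Q1", "1990Q4", freq="Q")
df = pd.DataFrame(index=idx)

# One dummy per observation from 1972:1 through 1979:4.
span = pd.period_range("1972Q1", "1979Q4", freq="Q")
for p in span:
    name = f"d_{p.year}_{p.quarter}"
    df[name] = (df.index == p).astype(int)   # 1 at that quarter, 0 elsewhere

print(df.shape)   # 84 quarters by 32 dummies
```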

Extract test statistics in a loop

omitted.prg illustrates how to extract a test statistic from a table created by freezing a diagnostic view. Note that the file begins by defining a subroutine; the main program follows the subroutine definition.
' extracting test statistics from frozen table
' revised for version 4.0 (10/20/2000 h)

'---------------------------------------------------------------------
'subroutine to test whether the first four lags of an omitted
'variable are jointly significant
'
'		eq1: name of equation object to test
'		  g: name of group object containing a list of series to test
'	results: name of table object to store results
'---------------------------------------------------------------------
subroutine omit_test(equation eq1, group g, table results)
	'number of series in group
	!n = g.@count

	table(!n+5,3) results
	
	'set column width of table
	setcolwidth(results, 1, 12)
	
	setcell(results, 1, 1, "Four lag F-test for omitted variables:", "l")
	setline(results, 2)
	results(3,2) = "F-stat"
	results(3,3) = "Probability"
	setline(results, 4)

	'loop through each series in group to test
	for !i = 1 to !n
		'get series
		series temp_s = g(!i)
		'run test 
		freeze(temp_t) eq1.testadd temp_s(-1) temp_s(-2) temp_s(-3) temp_s(-4)
		'store results in table
		results(!i+4, 1) = g.@seriesname(!i)
		results(!i+4, 2) = temp_t(3, 2)
		results(!i+4, 3) = temp_t(3, 5)
		'clean up
		delete temp_s
		delete temp_t
	next

	setline(results, !n+5)
endsub
'---------------------------------------------------------------------

'main program

'create workfile
workfile demo q 52 96

'change path to program path
%path = @runpath
cd "{%path}"

'read data from .xls file
read(b2) "..\data\demo.xls" 4

'estimate equation
smpl @all
equation eq1.ls dlog(m1) c dlog(m1(-1)) dlog(m1(-2)) dlog(m1(-3)) dlog(m1(-4))

'create group to test for omitted variables
group g1 dlog(gdp) dlog(rs) dlog(pr)

'declare table to store test results
table tab1

'call subroutine
call omit_test(eq1, g1, tab1)

'display results
show tab1
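The statistic that omit_test pulls out of the frozen table is a standard F-test for four added regressors. A self-contained Python/numpy sketch (made-up data; the restricted-versus-unrestricted SSR comparison replaces reading cells from the testadd output):

```python
import numpy as np

# Hypothetical data: does adding four lags of z help explain y?
gen = np.random.default_rng(1)
n = 120
z = gen.normal(size=n)
y = gen.normal(size=n)

def ssr(X, v):
    """Sum of squared residuals from OLS of v on X."""
    beta, *_ = np.linalg.lstsq(X, v, rcond=None)
    r = v - X @ beta
    return r @ r

# Align so that four lags of z are available.
Y = y[4:]
ones = np.ones(n - 4)
X_r = ones[:, None]                                  # restricted: constant only
lags = np.column_stack([z[4 - k:n - k] for k in range(1, 5)])
X_u = np.column_stack([ones, lags])                  # add z(-1)..z(-4)

# F-statistic for the four added lags.
q, dof = 4, len(Y) - X_u.shape[1]
F = ((ssr(X_r, Y) - ssr(X_u, Y)) / q) / (ssr(X_u, Y) / dof)
print(round(F, 4))
```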

Between group estimation for pooled data

between.prg stores the cross-section-specific means of each series in a matrix, creates a new workfile, converts the matrix to series, and runs the between groups regression.
' between group estimation for pool
' revised for version 4.0 (10/27/2000 h)

'change path to program path
%path = @runpath
cd "{%path}"

' load workfile
load ..\data\pool1

' define pool
smpl @all
pool pool1.add aut bus con cst dep hoa mae mis

' set number of cross-sections
!ncross = pool1.@ncross

' create group with variables
pool1.makegroup(tempgrp) log(ivm?) log(mm?)

' store means of two series in matrix
matrix(!ncross,2) means
series tempser
for !i = 1 to !ncross
	tempser = tempgrp(!i)
	means(!i,1) = @mean(tempser)
	tempser = tempgrp(!ncross+!i)
	means(!i,2) = @mean(tempser)
next
store(i) means
delete tempgrp tempser

' create new workfile and fetch means matrix
workfile between u 1 !ncross
fetch(i) means

' convert matrix to series
series lc_mean
series ly_mean
group g1 lc_mean ly_mean
mtos(means,g1)

' run between groups regression and cleanup
equation eq_bet.ls lc_mean c ly_mean
show eq_bet
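The between estimator is simply OLS on the cross-section means, so the matrix/store/fetch machinery above can be sketched in a few lines of Python with pandas (hypothetical panel data in place of the POOL1 workfile; lc and ly play the roles behind lc_mean and ly_mean):

```python
import numpy as np
import pandas as pd

# Hypothetical panel: 8 cross-sections, 20 periods each.
gen = np.random.default_rng(2)
panel = pd.DataFrame({
    "id": np.repeat(np.arange(8), 20),
    "ly": gen.normal(size=160),
})
panel["lc"] = 0.8 * panel["ly"] + gen.normal(scale=0.1, size=160)

# Cross-section means, one row per group (the `means` matrix).
m = panel.groupby("id")[["lc", "ly"]].mean()

# Between-groups regression: mean lc on a constant and mean ly.
X = np.column_stack([np.ones(len(m)), m["ly"].values])
beta, *_ = np.linalg.lstsq(X, m["lc"].values, rcond=None)
print(beta)   # [intercept, slope]
```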

Hausman test for fixed versus random effects

hausman.prg computes the Hausman test statistic for testing the null hypothesis of random effects against the alternative of fixed effects. The program estimates fixed and random effects models with two slope regressors and stores the estimated coefficients and their covariance matrices. (The Grunfeld data used in the program are transcribed from Greene (1997), Table 15.1.)
' hausman test for fixed versus random effects
' revised for version 4.0 (10/27/2000 h)

'change path to program path
%path = @runpath
cd "{%path}"

' load workfile
load ..\data\grunfeld

' set sample
smpl @all

' estimate fixed effects and store results
pool1.ls(f) log(inv?) log(val?) log(cap?)
vector beta = pool1.@coefs
matrix covar = pool1.@cov 

' keep only slope coefficients
vector b_fixed = @subextract(beta,1,1,2,1)
matrix cov_fixed = @subextract(covar,1,1,2,2)

' estimate random effects and store results
pool1.ls(r) log(inv?) log(val?) log(cap?)
beta = pool1.@coefs
covar = pool1.@cov 

' keep only slope coefficients
vector b_gls = @subextract(beta,2,1,3,1)
matrix cov_gls = @subextract(covar,2,2,3,3)

' compute Hausman test stat
matrix b_diff = b_fixed - b_gls
matrix var_diff = cov_fixed - cov_gls
matrix qform = @transpose(b_diff)*@inverse(var_diff)*b_diff

if qform(1,1)>=0 then
	' set table to store results
	table(4,2) result
	setcolwidth(result,1,20)
	setcell(result,1,1,"Hausman test for fixed versus random effects")
	setline(result,2)

	!df = @rows(b_diff)
	setcell(result,3,1,"chi-sqr(" + @str(!df) + ") = ")
	setcell(result,3,2,qform(1,1))
	setcell(result,4,1,"p-value = ")
	setcell(result,4,2,1-@cchisq(qform(1,1),!df))
	setline(result,5)

	show result
else
	statusline "Quadratic form is negative"
endif
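The quadratic form at the heart of the test, H = (b_fe - b_re)' [V_fe - V_re]^(-1) (b_fe - b_re), can be checked with a small numeric sketch in Python (made-up coefficient vectors and covariances in place of the pool estimates):

```python
import numpy as np

# Hypothetical slope estimates and covariances from fixed (within) and
# random effects (GLS) estimation of the same two-regressor model.
b_fixed = np.array([0.110, 0.310])
b_gls = np.array([0.105, 0.305])
cov_fixed = np.array([[0.0010, 0.0002],
                      [0.0002, 0.0015]])
cov_gls = np.array([[0.0008, 0.0001],
                    [0.0001, 0.0012]])

b_diff = b_fixed - b_gls
var_diff = cov_fixed - cov_gls      # must be positive definite for a valid test
H = b_diff @ np.linalg.inv(var_diff) @ b_diff
dof = len(b_diff)                   # chi-squared degrees of freedom

print(round(H, 4), dof)
```

If var_diff is not positive definite the statistic can come out negative, which is what the qform(1,1)>=0 check in the program guards against.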

Reformat regression output table

regrun.prg is the main program file; it "includes" the subroutine file regtab.prg.
' formatting regression output table
' revised for version 4.0 (10/20/2000 h)

' include subroutine file
include regtab.prg

'change path to program path
%path = @runpath
cd "{%path}"

' load workfile
load ..\data\basics

' declare table to store output
table tab1

' call subroutine 
call regtab(eq1, tab1, 3)

show tab1
regtab.prg contains a subroutine that reformats the estimation output so that the standard errors and t-statistics are enclosed in parentheses and displayed below the coefficient estimates.
' subroutine to reformat regression output
' revised for version 4.0 (10/20/2000 h)
'
'	   eq1: name of equation object
'	  tab1: name of table object for output
'	format: digits after decimal to display
'
subroutine regtab(equation eq1, table tab1, scalar format)

' number of estimated parameters
!ncoef = eq1.@ncoef
' number of observations in estimation sample
!obs = eq1.@regobs

' create temporary table with eviews estimation output
freeze(temp_table) eq1.results

' declare a new table
table(1,4) tab1
' format table
setcolwidth(tab1, 1, 19)
setcolwidth(tab1, 2, format+4)
setcolwidth(tab1, 3, 19)
setcolwidth(tab1, 4, format+4)

' get original header information
!line = 1
while temp_table(!line, 2) <> "Coefficient"
	setcell(tab1, !line, 1, temp_table(!line, 1), "l") 
	!line = !line +1
wend

setline(tab1, !line-1)
setcell(tab1, !line, 1, "Variable", "c" )
setcell(tab1, !line, 2, "Estimate","r")
setcell(tab1, !line, 3, "Estimate","r")
setcell(tab1,!line+1,2, "(s.e.)","r")
setcell(tab1,!line+1,3, "(t-stat)","r")
!line = !line + 2
setline(tab1, !line)
!line = !line + 1
!vline = !line 

' fill all of the coefficients and standard errors 
' (or t-statistics)
for !i = 1 to !ncoef
	' get variable name
	%variable = temp_table(!vline-1, 1)
	
	' write variable name
	setcell(tab1, !line, 1, %variable, "c")
	!vline = !vline + 1

	' write coefficients
	!est=eq1.@coefs(!i)
	setcell(tab1, !line, 2, !est, "r", format )
	setcell(tab1, !line, 3, !est, "r", format )
	!line = !line + 1
	
	' compute statistics
	!se = sqr(eq1.@covariance(!i, !i))
	!tstat = !est/!se
	!tprob = @tdist(!tstat, !obs-!ncoef)
	
	' write standard errors in parenthesis
	setcell(tab1, !line, 2, !se, "r", format )
	%str_se = tab1(!line, 2)
	%str_se = "(" + %str_se + ")"
	tab1(!line, 2) = %str_se

	' write t-statistic output
	setcell(tab1, !line, 3, !tstat, "r", format )
	%str_t = "(" + tab1(!line, 3) + ")"
	' mark if significant
	if !tprob < .01 then
		%str_t = "**" + %str_t
	else 
		if !tprob < .05 then
			%str_t = "*" + %str_t
		endif
	endif
	tab1(!line, 3) = %str_t

	' increment line counter
	!line = !line + 1
next

setline(tab1, !line)
!line = !line + 1

' original results at bottom of table
setcell(tab1, !line, 1,  "R-squared", "l") 
setcell(tab1, !line, 2, eq1.@r2, "r", format )
setcell(tab1, !line, 3,  "  Mean dependent var", "l")
setcell(tab1, !line, 4, eq1.@meandep, "r", format )
!line = !line + 1

setcell(tab1, !line, 1, "Adjusted R-squared", "l")
setcell(tab1, !line, 2, eq1.@rbar2, "r", format )
setcell(tab1, !line, 3, "  S.D. dependent var", "l")
setcell(tab1, !line, 4, eq1.@sddep, "r", format )
!line = !line + 1

setcell(tab1, !line, 1,  "S.E. of regression", "l")
setcell(tab1, !line, 2,  eq1.@se, "r", format )
setcell(tab1, !line, 3,  "  Akaike info criterion", "l")
setcell(tab1, !line, 4, eq1.@aic, "r", format )
!line = !line + 1

setcell(tab1, !line, 1, "Sum squared resid", "l")
setcell(tab1, !line, 2, eq1.@ssr, "r", format )
setcell(tab1, !line, 3, "  Schwarz criterion", "l")
setcell(tab1, !line, 4, eq1.@schwarz, "r", format )
!line = !line + 1

setcell(tab1, !line, 1, "Log likelihood", "l")
setcell(tab1, !line, 2, eq1.@logl, "r",  format )
setcell(tab1, !line, 3, "  F-statistic", "l")
setcell(tab1, !line, 4, eq1.@f, "r", format )
!line = !line + 1

setcell(tab1, !line, 1, "Durbin-Watson stat", "l")
setcell(tab1, !line, 2,  eq1.@dw, "r", format)
setcell(tab1, !line, 3, "  Prob(F-statistic)", "l")
setcell(tab1, !line, 4, @fdist(eq1.@f,(!ncoef-1),(!obs-!ncoef)), "r", format)
!line = !line + 1
setline(tab1, !line)

endsub