Beyond Microfoundations: One PhD student's musings about economics, politics, and mountaineering... by David R. Pugh<br />
<br />
<b>Dynamic programming with credit constraints</b> (2013-02-17)<br />
I am looking for <i>simple</i> examples of economic models with occasionally binding credit constraints. I would like to find the most straightforward example possible, and then bludgeon it into submission with my various numerical algorithms...suggestions are much appreciated!<br />
<br />
<b>Solving a deterministic RBC model</b> (2013-02-13)<br />
Taking a short break from marking undergraduate economics essays, I decided to write a bit of Python <a href="https://github.com/davidrpugh/computational-econ-labs/blob/master/Dynamic-Programming-Lab/deterministic-RBC.py" target="_blank">code</a> to solve a deterministic RBC model using <a href="http://www.wouterdenhaan.com/numerical/VFIslides.pdf" target="_blank">value function iteration</a>. Below are plots of the optimal policy functions (I included some of the iterates of the policy functions as well). <br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/computational-econ-labs/master/Dynamic-Programming-Lab/Graphics/RBC-value-policy-iterates.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="132" src="https://raw.github.com/davidrpugh/computational-econ-labs/master/Dynamic-Programming-Lab/Graphics/RBC-value-policy-iterates.png" width="400" /></a></div>
<br />
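On the speed front, the Bellman update can be vectorized so that Python never loops over the capital grid: evaluate utility once on a full $(k, k')$ array and maximize along an axis with NumPy broadcasting. Here is a minimal sketch for a stripped-down growth model with log utility and full depreciation (my own illustrative parameterization, not the calibration in the linked code), chosen because its known analytical policy $k' = \alpha\beta k^{\alpha}$ gives something to check against:

```python
import numpy as np

alpha, beta = 0.33, 0.95
k_grid = np.linspace(0.05, 0.5, 500)

# Consumption implied by every (k, k') pair: rows index k today, columns k' tomorrow.
c = k_grid[:, np.newaxis] ** alpha - k_grid[np.newaxis, :]
u = np.where(c > 0, np.log(np.where(c > 0, c, 1.0)), -np.inf)  # -inf rules out infeasible k'

V = np.zeros(k_grid.size)
for _ in range(1000):
    V_new = (u + beta * V).max(axis=1)  # one vectorized Bellman update, no inner loop
    if np.abs(V_new - V).max() < 1e-8:
        V = V_new
        break
    V = V_new

policy = k_grid[(u + beta * V).argmax(axis=1)]  # optimal k' for each k
```

On a 500-point grid the computed policy should agree with the analytical one to roughly the grid spacing, and the vectorized update runs orders of magnitude faster than a nested Python loop.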
Again, the code is mind-numbingly slow (possibly due to the <a href="http://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.interp1d.html" target="_blank">interpolation scheme</a> I am currently using) and takes roughly 8-10 minutes to finish. Any suggestions for speeding up the code (perhaps by using fancy indexing to avoid the for loop!) would be greatly appreciated!<br />
<br />
<b>Assaulting the Ramsey model (numerically!)</b> (2013-02-13)<br />
Everything (and then some!) that you would ever want to know about using dynamic programming techniques to solve deterministic and stochastic versions of the Ramsey optimal growth model can be found in this <a href="http://www.wiwi.uni-augsburg.de/vwl/maussner/lehrstuhl/pap/hm_value3(18Dec09final).pdf" target="_blank">paper</a>. <br />
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
I wrote up a quick implementation of the most basic version of the value function iteration described in the paper (vanilla value iteration with a good initial guess and cubic spline interpolation). Below is a graphic I produced of the optimal value and policy functions as well as every 50th iterate (to give a sense of the convergence properties).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/computational-econ-labs/master/Dynamic-Programming-Lab/Graphics/Ramsey-value-policy-iterates.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://raw.github.com/davidrpugh/computational-econ-labs/master/Dynamic-Programming-Lab/Graphics/Ramsey-value-policy-iterates.png" width="400" /></a></div>
<div>
The Python <a href="https://github.com/davidrpugh/computational-econ-labs/blob/master/Dynamic-Programming-Lab/deterministic-ramsey.py" target="_blank">code</a> is slowish (takes several minutes to compute the above functions). Suggestions on ways to speed up the code are definitely welcome!</div>
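For concreteness, the basic iteration (maximize the consumption-savings tradeoff against a cubic-spline interpolant of the current value function) can be sketched as follows. The parameter values, grid, and initial guess are my own illustrative choices, not necessarily those in the linked code:

```python
import numpy as np
from scipy.interpolate import InterpolatedUnivariateSpline
from scipy.optimize import fminbound

alpha, beta, delta = 0.33, 0.9, 0.1  # illustrative parameter values

def f(k):
    """Total resources on hand: output plus undepreciated capital."""
    return k ** alpha + (1 - delta) * k

k_grid = np.linspace(0.2, 6.0, 30)
V = np.log(k_grid)  # a rough initial guess

for _ in range(200):
    spline = InterpolatedUnivariateSpline(k_grid, V, k=3)  # cubic interpolant of V
    V_new = np.empty_like(V)
    for i, k in enumerate(k_grid):
        # choose k' to maximize ln(c) + beta * V(k'), with c = f(k) - k'
        obj = lambda kp: -(np.log(f(k) - kp) + beta * spline(kp))
        V_new[i] = -obj(fminbound(obj, k_grid[0], min(f(k) - 1e-8, k_grid[-1])))
    if np.abs(V_new - V).max() < 1e-5:
        V = V_new
        break
    V = V_new

# Savings policy implied by the (approximately) converged value function.
spline = InterpolatedUnivariateSpline(k_grid, V, k=3)
policy = np.array([fminbound(lambda kp, k=k: -(np.log(f(k) - kp) + beta * spline(kp)),
                             k_grid[0], min(f(k) - 1e-8, k_grid[-1]))
                   for k in k_grid])
```

A useful correctness check is that the policy function crosses the 45-degree line at the model's analytical steady state, $k^{*} = \left(\alpha/(1/\beta - 1 + \delta)\right)^{1/(1-\alpha)}$.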
<div>
<br /></div>
<div>
Back to the grind of marking essays...enjoy!</div>
<b>Technology and the Solow residual...</b> (2013-01-26)<br />
I recently wrote some <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">Python code</a> to compute paths of technology and the implied <a href="http://en.wikipedia.org/wiki/Solow_residual" target="_blank">Solow residuals</a> using data from the <a href="https://pwt.sas.upenn.edu/" target="_blank">Penn World Tables</a>. Combining the results with some country metadata (i.e., income groupings) from the <a href="http://oliversherouse.github.com/wbdata/" target="_blank">World Bank API</a> yields this pretty interesting graphic...<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-26-Solow-Residual-by-Income-Group.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-26-Solow-Residual-by-Income-Group.png" width="400" /></a></div>
If you don't already have the Penn World Tables data...no worries! The script will download the PWT data and compute the Solow residuals using a method from <a href="http://www.jstor.org/stable/2586948" target="_blank">Hall and Jones (1999)</a>. Based on this decomposition, high-income (i.e., red) countries had higher levels of technology in 1960 and higher subsequent growth rates of technology. In fact, the low-income (i.e., purple) countries have had effectively zero technological progress since 1960!<br />
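The identity behind a Hall and Jones style decomposition is simple: given a Cobb-Douglas production function $Y = K^{\alpha}(AH)^{1-\alpha}$, labor-augmenting technology $A$ can be backed out directly. A minimal sketch (the inputs are made-up numbers purely for illustration, not PWT series):

```python
alpha = 1.0 / 3.0  # capital's share of income, the value used by Hall and Jones (1999)

def technology(Y, K, H):
    """Back out labor-augmenting technology A from Y = K**alpha * (A*H)**(1 - alpha)."""
    return (Y / (K ** alpha * H ** (1.0 - alpha))) ** (1.0 / (1.0 - alpha))

# Made-up inputs, purely for illustration.
Y, K, H = 100.0, 300.0, 50.0
A = technology(Y, K, H)
```

Plugging the recovered $A$ back into the production function should reproduce output exactly, which makes this easy to unit-test.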
<br />
Enjoy!<br />
<b>Solving models with heterogeneous agents...</b> (2013-01-24)<br />
This material was previously part of another <a href="http://beyondmicrofoundations.blogspot.co.uk/2013/01/welfare-costs-of-business-cycles-and.html" target="_blank">post</a> on the welfare costs of business cycles. I decided that the links on solving Krusell-Smith deserved their own post!<br />
<br />
The place to start is definitely Wouter den Haan's <a href="http://www.wouterdenhaan.com/notes.htm" target="_blank">website</a>, specifically his <a href="http://www.wouterdenhaan.com/numerical/introhetero.pdf" target="_blank">introductory slides</a> on heterogeneous agent models. <a href="http://www.econ.yale.edu/smith/paper15.pdf" target="_blank">Krusell and Smith (2006)</a> is an excellent literature review of heterogeneous agent models.<br />
<div>
<br /></div>
<div>
<b>Various solution algorithms: </b>The key difficulty in solving Krusell-Smith type models is finding a way to summarize the cross-sectional distribution of capital and employment status (a potentially infinite-dimensional object) with a limited set of its moments. </div>
<div>
<ul>
<li>Original <a href="http://www.wouterdenhaan.com/numerical/methodsheteroks.pdf" target="_blank">KS algorithm</a>: The KS algorithm specifies a law of motion for these moments and then finds an approximating function for this law of motion using a <a href="http://www.wouterdenhaan.com/numerical/simulationslides.pdf" target="_blank">simulation</a> procedure. Specifically, for a <i>given</i> set of individual policy rules (which can be solved for via value function iteration, etc.), a time series of cross-sectional moments is generated and new laws of motion for the aggregate moments are estimated using this simulated data. </li>
<li><a href="http://www.wouterdenhaan.com/numerical/methodsheteroxpa.pdf" target="_blank">Xpa algorithm</a> </li>
<li><a href="http://www.wouterdenhaan.com/numerical/methodshetero.pdf" target="_blank">Other algorithms</a></li>
</ul>
</div>
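The estimation step at the heart of the original KS algorithm boils down to fitting a log-linear law of motion for mean capital by OLS, one regression per aggregate state. A toy sketch of just that step, using a fabricated simulated series in place of real model output (the "true" coefficients below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the simulation step's output: a time series of the
# cross-sectional mean capital stock and the aggregate state (0 = bad, 1 = good).
T = 1000
state = rng.integers(0, 2, size=T)
log_K = np.empty(T)
log_K[0] = np.log(10.0)
for t in range(T - 1):
    # Fabricated "true" laws of motion, one per aggregate state.
    a, b = (0.05, 0.95) if state[t] == 1 else (0.02, 0.95)
    log_K[t + 1] = a + b * log_K[t] + 0.001 * rng.standard_normal()

# Estimation step: fit log K' = a_s + b_s * log K by OLS, separately by state.
coefs = {}
for s in (0, 1):
    mask = state[:-1] == s
    X = np.column_stack([np.ones(mask.sum()), log_K[:-1][mask]])
    coefs[s], *_ = np.linalg.lstsq(X, log_K[1:][mask], rcond=None)
```

In the actual algorithm one would then re-solve the individual problem taking the estimated law of motion as given, re-simulate, and iterate until the coefficients converge.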
<b>General issues with implementing the Krusell-Smith algorithm:</b><br />
<br />
<i><a href="http://www.robertkollmann.com/DEN_HAAN_MODEL_COMPARISON_JEDC_2010.pdf" target="_blank">Den Haan (2010):</a></i><br />
<blockquote class="tr_bq">
This paper compares numerical solutions to the model of Krusell and Smith [1998. Income and wealth heterogeneity in the macroeconomy. Journal of Political Economy 106, 867–896] generated by different algorithms. The algorithms have very similar implications for the correlations between different variables. Larger differences are observed for (i) the unconditional means and standard deviations of individual variables, (ii) the behavior of individual agents during particularly bad times, (iii) the volatility of the per capita capital stock, and (iv) the behavior of the higher-order moments of the cross-sectional distribution. For example, the two algorithms that differ the most from each other generate individual consumption series that have an average (maximum) difference of 1.63% (11.4%).</blockquote>
A concise description of the Krusell and Smith (1998) model can be found in the <a href="http://www.robertkollmann.com/DEN_HAAN_JUDD_JUILLARD_INTRODUCTION_JEDC_2010.pdf" target="_blank">introduction</a> to the JEDC Krusell-Smith comparison project, which seems a good place to start thinking about how to implement the Krusell-Smith algorithm in Python. The above paper suggests that implementation details matter and can substantially impact the accuracy of the solution. <a href="http://www.wouterdenhaan.com/software.htm" target="_blank">Wouter den Haan</a> provides <a href="http://www.wouterdenhaan.com/datasuite.htm" target="_blank">code</a> for the JEDC comparison project. Maybe start by implementing the <a href="http://wouterdenhaan.com/papers/expa.pdf" target="_blank">den Haan and Rendahl (2009)</a> algorithm in Python? Followed by <a href="http://elaine.ihs.ac.at/~mreiter/hetjedc.pdf" target="_blank">Reiter (2009)</a>. Given that different algorithms can arrive at different solutions, checking the accuracy of a given solution method against alternatives is important; there are also slides on <a href="http://www.wouterdenhaan.com/numerical/accuracyslides.pdf" target="_blank">checking the accuracy</a> of the above algorithms.<br />
<br />
<b>Other Krusell-Smith related links:</b><br />
<ul>
<li>Anthony Smith's original <a href="http://www.econ.yale.edu/smith/code.htm" target="_blank">Fortran</a> implementation of Krusell-Smith. The code looks completely impenetrable! Not much in the way of comments or documentation, so probably not worth working out how it works.</li>
<li>Serguei Maliar's <a href="http://www.stanford.edu/~maliars/Files/Codes.html" target="_blank">MATLAB code</a> for Krusell-Smith (Maliar provides code for other interesting projects as well!).</li>
<li>Fortran <a href="http://www.econ.yale.edu/smith/kmssprograms.zip" target="_blank">code</a> for <a href="http://www.econ.yale.edu/smith/sdarticle1.pdf" target="_blank">Krusell and Smith (2009)</a>.</li>
<li>Wouter den Haan's slides on <a href="http://www.wouterdenhaan.com/numerical/methodshetero_applications.pdf" target="_blank">additional applications</a> of the Krusell-Smith approach (includes monetary models with consumer heterogeneity, models with entrepreneurs, turning KS into a matching model, portfolio problem).</li>
</ul>
<b>Python code to grab Penn World Tables data</b> (2013-01-17)<br />
For any interested parties, I wrote a small Python <a href="https://github.com/davidrpugh/pyeconomics/blob/master/sandbox/pwt.py" target="_blank">script</a> to download the <a href="https://pwt.sas.upenn.edu/" target="_blank">Penn World Tables</a> dataset and convert it into a <a href="http://pandas.pydata.org/pandas-docs/dev/dsintro.html#panel" target="_blank">Pandas Panel</a>. The code passed all of my tests, but I can't claim that it is industrial strength.<br />
<br />
To test out the code, I put together a quick set of layered histograms of global real GDP per capita growth rates from 1951 to 2010. The result...<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/pyeconomics/master/sandbox/Histogram-historical-growth-rates-RGDPPC.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://raw.github.com/davidrpugh/pyeconomics/master/sandbox/Histogram-historical-growth-rates-RGDPPC.png" width="400" /></a></div>
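Since the number of countries in the sample changes so much over time, normalized histograms would put different years on a comparable footing; matplotlib's <code>density=True</code> does exactly this, rescaling each histogram to integrate to one. A small sketch with fabricated growth-rate samples (the sample sizes mimic the 1951 and 2010 country counts; everything else is invented):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)

# Fabricated growth-rate samples: few countries early on, many later.
samples = {1951: rng.normal(0.02, 0.03, 50), 2010: rng.normal(0.02, 0.04, 190)}

fig, ax = plt.subplots()
bins = np.linspace(-0.15, 0.20, 36)
for year, growth in samples.items():
    # density=True normalizes each histogram to integrate to one,
    # so years with very different country counts are comparable.
    ax.hist(growth, bins=bins, density=True, alpha=0.5, label=str(year))
ax.set_xlabel("Real GDP per capita growth rate")
ax.set_ylabel("Density")
ax.legend()
fig.savefig("layered-histograms.png")
```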
The number of countries for which data are available varies from around 50 in 1951 to roughly 190 in 2010. Perhaps this means I should have normalized the above histograms?<br />
<br />
<b>Welfare costs of business cycles and models with heterogeneous agents...</b> (2013-01-12)<br />
This is the start of a quasi-literature review for a heterogeneous agents project I will begin in the near future. I will continue to update this post as I come across/finish reading additional papers...comments or links to important papers are welcome!<br />
<br />
<b>Literature:</b><br />
<b><br /></b>
<i><a href="http://pages.stern.nyu.edu/~dbackus/Taxes/Lucas%20priorities%20AER%2003.pdf" target="_blank">Lucas (2003):</a> </i>Robert Lucas' Presidential Address at the 2003 AEA meetings, in which he summarizes and defends his back-of-the-envelope calculation of the welfare costs of business cycles. A good, gentle introduction to the literature. Includes a good reference list and a brief discussion of the Krusell and Smith (2002) working paper.<br />
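The back-of-the-envelope calculation itself is simple enough to reproduce in a few lines: with CRRA utility and log consumption fluctuating around trend with standard deviation $\sigma$, the compensating consumption increment is approximately $\lambda \approx \frac{1}{2}\gamma\sigma^{2}$. A sketch (the $\sigma$ value is the commonly cited post-war U.S. figure of about 0.032; treat the numbers as illustrative):

```python
def lucas_cost(gamma, sigma):
    """Welfare cost of fluctuations as a share of consumption: (1/2) * gamma * sigma**2."""
    return 0.5 * gamma * sigma ** 2

# Log consumption deviations from trend, sd of roughly 3.2% (post-war U.S.)
sigma = 0.032
for gamma in (1, 5, 20):
    print(f"gamma = {gamma:2d}: cost = {100 * lucas_cost(gamma, sigma):.3f}% of consumption")
```

For $\gamma$ between 1 and 20 this gives costs ranging from roughly 0.05% to about 1% of consumption, which is exactly why Lucas concluded the potential gains from further stabilization are small.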
<br />
<i><a href="http://www.nber.org/papers/w10926" target="_blank">Barlevy (2004):</a></i><br />
<blockquote class="tr_bq">
This article reviews the literature on the cost of U.S. post-War business cycle fluctuations. I argue that recent work has established this cost is considerably larger than initial work found. However, despite the large cost of macroeconomic volatility, it is not obvious that policymakers should have pursued a more aggressive stabilization policy than they did. Still, the fact that volatility is so costly suggests stable growth is a desirable goal that ought to be maintained to the extent possible, just as policymakers are currently required to do under the Balanced Growth and Full Employment Act of 1978. This survey was prepared for the Economic Perspectives, a publication of the Federal Reserve Bank of Chicago.</blockquote>
As boring an abstract as you will ever come across. Includes a nice table summarizing various estimates of the cost of business cycles.<br />
<br />
<i><a href="http://www.econ.yale.edu/smith/250034.pdf" target="_blank">Krusell and Smith (1998):</a></i><br />
<blockquote class="tr_bq">
How do movements in the distribution of income and wealth affect the macroeconomy? We analyze this question using a calibrated version of the stochastic growth model with partially uninsurable idiosyncratic risk and movements in aggregate productivity. Our main finding is that, in the stationary stochastic equilibrium, the behavior of the macroeconomic aggregates can be almost perfectly described using only the mean of the wealth distribution. This result is robust to substantial changes in both parameter values and model specification. Our benchmark model, whose only difference from the representative-agent framework is the existence of uninsurable idiosyncratic risk, displays far less cross-sectional dispersion and skewness in wealth than U.S. data. However, an extension that relies on a small amount of heterogeneity in thrift does succeed in replicating the key features of the wealth data. Furthermore, this extension features aggregate time series that depart significantly from permanent income behavior.</blockquote>
<i><a href="http://www.econ.yale.edu/smith/sdarticle10.pdf" target="_blank">Krusell and Smith (1999):</a></i><br />
<blockquote class="tr_bq">
We investigate the welfare effects of eliminating business cycles in a model with substantial consumer heterogeneity. The heterogeneity arises from uninsurable and idiosyncratic uncertainty in preferences and employment, where, regarding employment, we distinguish among employment and short- and long-term unemployment. We calibrate the model to match the distribution of wealth in U.S. data and features of transitions between employment and unemployment. Unlike previous studies, we study how business cycles affect different groups of consumers. We conclude that the cost of cycles is small for almost all groups and, indeed, is negative for some.</blockquote>
<i><a href="http://www.brown.edu/Departments/Economics/Papers/2004/2004-08_paper.pdf" target="_blank">Krebs (2004):</a></i><br />
<blockquote class="tr_bq">
This paper analyzes the welfare costs of business cycles when workers face uninsurable idiosyncratic labor income risk. In accordance with the previous literature, this paper decomposes labor income risk into an aggregate and an idiosyncratic component, but in contrast to the previous literature, this paper allows for multiple sources of idiosyncratic labor income risk. Using the multi-dimensional approach to idiosyncratic risk, this paper provides a general characterization of the welfare cost of business cycles when preferences and the (marginal) process of individual labor income in the economy with business cycles are given. The general analysis shows that the introduction of multiple sources of idiosyncratic risk never decreases the welfare cost of business cycles, and strictly increases it if there are cyclical fluctuations across the different sources of risk. Finally, this paper also provides a quantitative analysis of multi-dimensional labor income risk based on a version of the model that is calibrated to match U.S. labor market data. The quantitative analysis suggests that realistic variations across two particular dimensions of idiosyncratic labor income risk increase the welfare cost of business cycles by a substantial amount.</blockquote>
<div class="p1">
<i><a href="http://www.princeton.edu/wwseconpapers/papers/dp235.pdf" target="_blank">Schulhofer-Wohl (2008):</a></i></div>
<blockquote class="tr_bq">
I study the welfare cost of business cycles in a complete-markets economy where some people are more risk averse than others. Relatively more risk-averse people buy insurance against aggregate risk, and relatively less risk-averse people sell insurance. These trades reduce the welfare cost of business cycles for everyone. Indeed, the least risk-averse people benefit from business cycles. Moreover, even infinitely risk-averse people suffer only finite and, in my empirical estimates, very small welfare losses. In other words, when there are complete insurance markets, aggregate fluctuations in consumption are essentially irrelevant not just for the average person—the surprising finding of Lucas [Lucas, Jr., R.E., 1987. Models of Business Cycles. Basil Blackwell, New York] but for everyone in the economy, no matter how risk averse they are. If business cycles matter, it is because they affect productivity or interact with uninsured idiosyncratic risk, not because aggregate risk <i>per se </i>reduces welfare. </blockquote>
<i><a href="http://www.econ.yale.edu/smith/revisit.pdf" target="_blank">Krusell et al. (2009):</a></i><br />
<blockquote class="tr_bq">
We investigate the welfare effects of eliminating business cycles in a model with substantial consumer heterogeneity. The heterogeneity arises from uninsurable and idiosyncratic uncertainty in preferences and employment status. We calibrate the model to match the distribution of wealth in U.S. data and features of transitions between employment and unemployment. In comparison with much of the literature, we find rather large effects. For our benchmark model, we find welfare effects that, on average across all consumers, are of a bit more than one order of magnitude larger than those computed by Lucas [Lucas Jr., R.E., 1987. Models of Business Cycles. Basil Blackwell, New York]. When we distinguish long- from short-term unemployment, long-term unemployment being distinguished by poor (and highly procyclical) employment prospects and low unemployment compensation, the average gain from eliminating cycles is as much as 1% in consumption equivalents. In addition, in both models, there are large differences across groups: very poor consumers gain a lot when cycles are removed (the long-term unemployed as much as around 30%), as do very rich consumers, whereas the majority of consumers—the “middle class”—sees much smaller gains from removing cycles. Inequality also rises substantially upon removing cycles.</blockquote>
The above paper has a <a href="http://www.econ.yale.edu/smith/revisit.pdf" target="_blank">2002 working paper</a> that seems to come to different conclusions about the welfare costs of business cycles. Technical <a href="http://www.econ.yale.edu/smith/kmssappx.pdf" target="_blank">appendices</a> are also provided.<br />
<br />
<b>How well do you know your utility function?</b> (2013-01-03)<br />
This is an excerpt from my teaching notes for an upcoming computational economics lab on the RBC model that I am teaching at the University of Edinburgh. I thought it might be of more general interest (mostly because of the cool graphics!)...<br />
<br />
In the basic RBC model from Chapter 5 of David Romer's <i><a href="http://www.amazon.co.uk/Advanced-Macroeconomics-McGraw-Hill-Economics-David/dp/0073511374" target="_blank">Advanced Macroeconomics</a></i>, the representative household has the following single-period utility function: $$u(C_{t}, l_{t}) = \ln(C_{t}) + b\ \ln(1 - l_{t})$$ where $C_{t}$ is per capita consumption, $l_{t}$ is labor (note that the labor endowment has been normalized to 1!), and $b$ is a parameter (just a weight that the household places on utility from leisure relative to utility from consumption).<br />
<br />
First a 3D plot of the utility surface...<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-RBC-Utility-Surface.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-RBC-Utility-Surface.png" width="400" /></a></div>
Followed by a nice contour plot showing the indifference curves for the agent...<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-RBC-Contour-Plot-Utility-Surface.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-RBC-Contour-Plot-Utility-Surface.png" width="400" /></a></div>
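Both plots can be reproduced in a few lines with matplotlib's 3D toolkit; here is the surface (the value of $b$ below is my own illustrative choice, not necessarily the lab's calibration):

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt

b = 2.5  # weight on leisure; illustrative, not necessarily the lab's calibration

def utility(C, l):
    """Single-period utility u(C, l) = ln(C) + b * ln(1 - l)."""
    return np.log(C) + b * np.log(1 - l)

# Evaluate utility on a consumption-labor grid (l < 1 keeps leisure positive).
C, l = np.meshgrid(np.linspace(0.1, 10, 100), np.linspace(0.0, 0.9, 100))

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(C, l, utility(C, l), cmap="viridis")
ax.set_xlabel("$C_t$")
ax.set_ylabel("$l_t$")
ax.set_zlabel("$u(C_t, l_t)$")
fig.savefig("utility-surface.png")
```

Swapping `plot_surface` for `contour` on the same grid gives the indifference-curve plot.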
Suppose that the representative household lives for two periods and that there is no uncertainty about future prices. Because of logarithmic preferences, the household will follow the decision rule 'consume a fixed fraction of lifetime net worth.' We can derive this decision rule formally as follows. First, note that the household budget constraint is $$C_{0} + \frac{1}{1 + r_{1}}C_{1} = w_{0}l_{0} + \frac{1}{1 + r_{1}}w_{1}l_{1}$$ where $r_{1}$ is the real interest rate. The Lagrangian for the household's two-period optimization problem is $$\mathcal{L} = \ln(C_{0}) + b\ \ln(1-l_{0}) + \beta[\ln(C_{1}) + b\ \ln(1-l_{1})] + \lambda\left[w_{0}l_{0} + \frac{1}{1 + r_{1}}w_{1}l_{1} - C_{0} - \frac{1}{1 + r_{1}}C_{1}\right]$$ The household chooses sequences of consumption and labor (i.e., the representative household chooses $C_{0}, C_{1}, l_{0}, l_{1}$). The FOCs, along with the budget constraint, imply a system of 5 equations in the 5 unknowns $C_{0}, C_{1}, l_{0}, l_{1}, \lambda$: $$\begin{align}\frac{1}{C_{0}} - \lambda =& 0 \\ \beta\frac{1}{C_{1}} - \lambda \frac{1}{1+r_{1}} =& 0 \\ -\frac{b}{1 - l_{0}} + \lambda w_{0} =& 0 \\ -\beta\frac{b}{1 - l_{1}} + \lambda \frac{1}{1 + r_{1}}w_{1} =& 0 \\ C_{0} + \frac{1}{1 + r_{1}}C_{1} =& w_{0}l_{0} + \frac{1}{1 + r_{1}}w_{1}l_{1}\end{align}$$ This 5-equation system can be reduced (by eliminating the Lagrange multiplier $\lambda$) to a linear system of 4 equations in 4 unknowns: $$\begin{bmatrix} b & 0 & w_{0} & 0 \\ \beta(1 + r_{1}) & -1 & 0 & 0 \\ 0 & b & 0 & w_{1} \\ 1 & \frac{1}{1 + r_{1}} & -w_{0} & -\frac{1}{1+r_{1}}w_{1} \end{bmatrix} \begin{bmatrix}C_{0} \\ C_{1} \\ l_{0} \\ l_{1}\end{bmatrix} = \begin{bmatrix} w_{0} \\ 0 \\ w_{1} \\ 0\end{bmatrix}$$ This system can be solved in closed form (via Cramer's rule, substitution, etc.) to yield the following optimal policies for consumption and labor supply: $$\begin{align}C_{0} =& \frac{1}{(1 + b)(1 + \beta)}\left(w_{0} + \frac{1}{1 + r_{1}}w_{1}\right) \\ C_{1} =& \left(\frac{1 + r_{1}}{1 + b}\right)\left(\frac{\beta}{1 + \beta}\right)\left(w_{0} + \frac{1}{1 + r_{1}}w_{1}\right) \\ l_{0} =& 1 - \left(\frac{b}{w_{0}}\right)\left(\frac{1}{(1 + b)(1 + \beta)}\right)\left(w_{0} + \frac{1}{1 + r_{1}}w_{1}\right) \\ l_{1} =& 1 - \left(\frac{b\beta(1+r_{1})}{w_{1}}\right) \left(\frac{1}{(1 + b)(1 + \beta)}\right)\left(w_{0} + \frac{1}{1 + r_{1}}w_{1}\right) \end{align}$$<br />
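As a sanity check, the 4x4 system can also be handed to a numerical linear solver and compared with the closed form. A quick sketch (the preference parameters $b$ and $\beta$ are my own illustrative picks; the prices are those used in the plots below):

```python
import numpy as np

b, beta = 2.0, 0.95           # illustrative preference parameters (my choice)
w0, w1, r1 = 5.0, 9.0, 0.025  # the prices used in the plots below

R = 1.0 + r1
# Rows: period-0 labor FOC, Euler equation, period-1 labor FOC, budget constraint.
A = np.array([[b,        0.0,   w0,   0.0],
              [beta * R, -1.0,  0.0,  0.0],
              [0.0,      b,     0.0,  w1],
              [1.0,      1.0/R, -w0,  -w1 / R]])
d = np.array([w0, 0.0, w1, 0.0])
C0, C1, l0, l1 = np.linalg.solve(A, d)

# Closed-form solution for comparison.
W = w0 + w1 / R                        # lifetime net worth
C0_cf = W / ((1.0 + b) * (1.0 + beta))
l0_cf = 1.0 - (b / w0) * C0_cf
```

With these prices the numerical solution also reproduces the feature discussed below: period-0 labor supply is pushed close to zero because the period-1 wage is nearly twice as high.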
Several important points to note about the above optimal consumption and labor supply policies:<br />
<br />
<ul>
<li>The household's lifetime net worth, $w_{0} + \frac{1}{1 + r_{1}}w_{1}$, is the present discounted value of its labor endowment.</li>
<li>The household's lifetime net worth depends on the wages in BOTH periods and on the future interest rate. This hints at the more general result that, if the household has an infinite time horizon, lifetime net worth depends on the entire future path of wages and interest rates.</li>
<li>In each period, the household consumes a fraction of its lifetime net worth. Although the fraction differs across the two periods in this simple model, if the household has an infinite horizon, the fraction of lifetime net worth consumed each period is fixed and equal to $$\frac{1}{(1 + b)(1 + \beta + \beta^2 + \dots)}=\frac{1 - \beta}{1 + b}$$</li>
<li>From the policy function for $l_{0}$, one can show that in order for the labor supply in period $t=0$ to be non-negative (which it must!), the following inequality must hold: $$\left(\frac{1}{1 + r_{1}}\right)\left(\frac{w_{1}}{w_{0}}\right) \lt \frac{(1 + b)(1 + \beta)}{b} - 1$$</li>
<li>From the policy function for $l_{1}$, in order for the labor supply in period $t=1$ to be non-negative (which it must!), the following inequality must hold: $$(1 + r_{1})\left(\frac{w_{0}}{w_{1}}\right) \lt \left(\frac{1 + b}{b}\right)\left(\frac{1 + \beta}{\beta}\right) - 1$$</li>
</ul>
<br />
If we specify some prices (i.e., wages $w_{0}=5$ and $w_{1}=9$ and the interest rate $r_{1}=0.025$), then we can graphically represent the optimal choices of consumption and labor supply in periods $t=0$ and $t=1$ as follows.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-Optimal-Bundle-Period-0.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-Optimal-Bundle-Period-0.png" width="400" /></a></div>
Note that with the wage in period $t=1$ almost twice as high as the wage in period $t=0$, the agent pushes his labor supply in period $t=0$, $l_{0}$, almost all the way to zero (i.e., he chooses not to work very much). The agent can still consume because, absent any frictions (things like borrowing constraints, incomplete markets, imperfect contracts, etc.), he can easily consume some of his period $t=1$ labor earnings in the current period.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-Optimal-Bundle-Period-1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-03-Optimal-Bundle-Period-1.png" width="400" /></a></div>
In period $t=1$, although the higher wage causes the agent to significantly increase his labor supply, there is not much change in his level of consumption (i.e., there is consumption smoothing!).<br />
<br />
As always, code is available on <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">GitHub</a>.<br />
<b>Python, IPython, and Emacs</b> (2013-01-03)<br />
For a long time now I have been meaning to move to Emacs full-time. Today I decided to bite the bullet and dive into setting up Python, IPython, and Emacs on my MacBook. The process was surprisingly painless. Hat tip to Jess Hamrick for this very <a href="http://www.jesshamrick.com/2012/09/18/emacs-as-a-python-ide/" target="_blank">detailed post</a> that helped get me up and running.<br />
<br />
<b>Graph of the Day</b> (2013-01-02)<br />
A busy day (actually trying to do a bit of my own research!)...so I just threw together a plot of the historical civilian unemployment rate using FRED data (similar to Figure 1-3 from Mankiw's intermediate macroeconomics textbook). Very boring, I know, but tomorrow I promise something a bit more interesting!<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-02-Mankiw-Fig-1-3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-02-Mankiw-Fig-1-3.png" width="400" /></a></div>
<br />
If anyone can point me in the direction of the actual data that Mankiw uses to generate the graphs from his textbook I would be very grateful. I can't seem to find it! Code for the above is available on <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">GitHub</a>.<br />
<b>Graph of the Day</b> (2013-01-01)<br />
A New Year and a new graph of the day! This graphic actually uses a new Python library, <a href="http://oliversherouse.github.com/wbdata/" target="_blank">wbdata</a>, for grabbing World Bank data via the World Bank's API. Here is a plot of global inflation over the last 50-odd years for all available countries. I have color-coded the countries according to income group: Low, Lower-Middle, Upper-Middle, or High.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-01-Global-Inflation-by-Income-Groups.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2013-01-01-Global-Inflation-by-Income-Groups.png" width="400" /></a></div>
I am not entirely thrilled with this graph. It turned out to be hard to scale the y-axis to capture the full range of the data: the Democratic Republic of the Congo had an annual inflation rate over 23,000% in 1994! Zimbabwe <i>would</i> have had even higher annual inflation rates, but it stopped reporting inflation statistics in 2006 (just prior to the onset of its recent bout of <a href="http://en.wikipedia.org/wiki/Hyperinflation_in_Zimbabwe" target="_blank">hyperinflation</a>).<br />
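One way to tame that y-axis problem is a symmetric log scale, which is linear near zero and logarithmic in the extremes. A minimal sketch with made-up inflation paths, not the wbdata series used in the post (the <span style="font-family: Courier New, Courier, monospace;">linthresh</span> keyword assumes a reasonably recent matplotlib):

```python
# Made-up inflation paths: one "typical" country and one hyperinflation
# episode. A symmetric log scale keeps both readable on one axis.
# (The linthresh keyword assumes matplotlib 3.3+.)
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
import numpy as np

years = np.arange(1960, 2011)
rng = np.random.default_rng(0)
typical = rng.normal(5.0, 3.0, size=years.size)
crisis = np.where(years == 1994, 23000.0,
                  rng.normal(30.0, 20.0, size=years.size))

fig, ax = plt.subplots()
ax.plot(years, typical, label="typical country")
ax.plot(years, crisis, label="hyperinflation episode")
ax.set_yscale("symlog", linthresh=10)  # linear below 10%, log above
ax.set_ylabel("CPI inflation (%, annual)")
ax.legend()
```

Unlike a plain log scale, symlog also accommodates the deflation (negative inflation) episodes in the data.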
<br />
As always, code is available on <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">GitHub</a>.David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-78141852293969245462012-12-31T07:18:00.001-05:002012-12-31T07:19:13.697-05:00Graph of the DayToday's graphic is motivated by <a href="http://krugman.blogs.nytimes.com/2012/12/28/policy-implications-of-capital-biased-technology-opening-remarks/" target="_blank">recent</a> <a href="http://krugman.blogs.nytimes.com/2012/12/27/future-inequality-according-to-the-cbo/" target="_blank">posts</a> by Paul Krugman on the implications of capital-biased technological change. In both posts Krugman uses the ratio of compensation of employees (COE) to nominal GDP as his measure of labor's share of income. Although the data for both series go back to 1947, Krugman chooses to drop the data prior to 1973, arguing that 1973 marked the end of the post-WWII economic boom. Put another way, Krugman is saying that there is a structural break in the data-generating process for labor's share, which makes data prior to 1973 useless (or perhaps actively misleading) if one is interested in thinking about future trends in labor share.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-31-Krugmans-Plot.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-31-Krugmans-Plot.png" width="400" /></a></div>
<br />
If you are wondering what a plot of the entire time series looks like, here is the ratio of COE / GDP from 1947 forward.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-31-Declining-COE-Share-of-GDP.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-31-Declining-COE-Share-of-GDP.png" width="400" /></a></div>
It looks like the employee compensation ratio is roughly the same today as it was in 1950 (although the series was trending upward then and is trending downward now!). <br />
<br />
In his first <a href="http://krugman.blogs.nytimes.com/2012/12/27/future-inequality-according-to-the-cbo/" target="_blank">post</a> Krugman argues that this measure "fluctuates over the business cycle." Note that the vertical scale ranges only from 0.52 to 0.60. Such a narrow range visually exaggerates fluctuations in the series. Plotting the same data on its natural scale (i.e., 0 to 1) yields the following.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-31-Constant-COE-Share-of-GDP.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-31-Constant-COE-Share-of-GDP.png" width="400" /></a></div>
Based on this plot, the measure appears to have been <a href="http://en.wikipedia.org/wiki/Kaldor's_facts" target="_blank">remarkably constant</a> over the past 60-odd years.<br />
<br />
Which of these plots gives the more "correct" view of the data? Or does it depend on the point you are trying to make?<br />
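The effect of the axis range is easy to reproduce. Here is a sketch using an illustrative, roughly constant series (not the actual FRED COE and GDP data), plotted once on a tight scale and once on the full [0, 1] share scale:

```python
# An illustrative labor-share-like series, plotted twice: a tight y-axis
# makes cyclical wiggles pop out; the full [0, 1] share scale makes the
# same series look nearly constant.
import matplotlib
matplotlib.use("Agg")  # render off-screen
import matplotlib.pyplot as plt
import numpy as np

years = np.arange(1947, 2013)
rng = np.random.default_rng(1)
labor_share = (0.56 + 0.01 * np.sin(years / 5.0)
               + rng.normal(0.0, 0.003, years.size))

fig, (ax_tight, ax_full) = plt.subplots(1, 2, figsize=(8, 3))
for ax in (ax_tight, ax_full):
    ax.plot(years, labor_share)
    ax.set_ylabel("COE / GDP")
ax_tight.set_ylim(0.52, 0.60)  # Krugman-style tight scale
ax_full.set_ylim(0.0, 1.0)     # natural share scale
```

Same numbers, two visual stories; only the `set_ylim` calls differ.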
<br />
As always, code is <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">available</a>.<br />
<br />David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-40350913705122649602012-12-28T20:30:00.000-05:002012-12-28T20:30:25.115-05:00Graph of the DayTook a few days off blogging for Christmas and Boxing Day, but am now back at it! Here is a quick plot of historical measures of inflation in the U.S.. I used Pandas to grab the three price indices, and then used a nice built-in Pandas method <span style="font-family: Courier New, Courier, monospace;">pct_change(periods)</span><span style="font-family: Times, Times New Roman, serif;">to convert the monthly price indices (i.e., CPIAUCNS and CPIAUCSL) and</span><span style="font-family: Times, Times New Roman, serif;"> the quarterly GDP deflator to measures of percentage change in prices from a year ago (which is a standard measure of inflation). </span><br />
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;">After combining the three series into a single DataFrame object, you can plot all three series with a single line of code!</span><br />
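The <span style="font-family: Courier New, Courier, monospace;">pct_change</span> step can be sketched with a tiny made-up price index (not the FRED series): <span style="font-family: Courier New, Courier, monospace;">periods=12</span> gives the year-over-year change for monthly data, and <span style="font-family: Courier New, Courier, monospace;">periods=4</span> would do the same for the quarterly deflator.

```python
# A tiny made-up monthly price index (not the FRED data): pct_change with
# periods=12 gives the percentage change from a year ago for monthly
# series; periods=4 would do the same for a quarterly series.
import pandas as pd

dates = pd.date_range("2010-01-01", periods=24, freq="MS")
cpi = pd.Series([100 * 1.002 ** i for i in range(24)], index=dates)

inflation = 100 * cpi.pct_change(periods=12)  # NaN for the first 12 months
```

Once the three inflation series sit in one DataFrame, <span style="font-family: Courier New, Courier, monospace;">df.plot()</span> really is the promised single line.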
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-28-Mankiw-Figure-1-2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-28-Mankiw-Figure-1-2.png" width="400" /></a></div>
<span style="font-family: Times, 'Times New Roman', serif;">Unsurprisingly, the three measures track one another very closely. Perhaps I should have thrown in some measures of producer prices? Code is available <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">here</a>.</span>David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-79489183358238625572012-12-24T07:18:00.001-05:002012-12-24T07:18:47.320-05:00Graph(s) of the Day!<span style="font-family: Times, Times New Roman, serif;"><span style="background-color: white; line-height: 17px;">Today's graphic(s) attempt to dispel a common misunderstanding of basic probability theory. We all know that a fair coin has exactly a 50% chance of coming up heads on each flip. Given this, many people seem to think that the <a href="http://en.wikipedia.org/wiki/Law_of_large_numbers" target="_blank">Law of Large Numbers (LLN)</a> tells us</span><span style="background-color: white; line-height: 17px;"> that the <i>observed number</i> of heads should more or less equal the <i>expected number</i> of heads. This intuition is wrong!</span></span><br />
<span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;"><br /></span></span>
<span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;">A South African mathematician named <a href="http://en.wikipedia.org/wiki/John_Edmund_Kerrich" target="_blank">John Kerrich</a> was visiting Copenhagen in 1940 when Germany invaded Denmark. Kerrich spent the next five years in an internment camp where, to pass the time, he carried out a series of experiments in probability theory...including an experiment where he flipped a coin by hand 10,000 times! He apparently also used ping-pong balls to demonstrate Bayes' theorem.</span></span><br />
<span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;"><br /></span></span>
<span style="font-family: Times, Times New Roman, serif;"><span style="background-color: white; line-height: 17px;">After the war Kerrich was released and published the results of many of his experiments. I have copied the table of the coin-flipping results reported by Kerrich below (and included a CSV file on GitHub). The first two columns are self-explanatory; the third column, <b>Difference</b>, </span><span style="background-color: white; line-height: 17px;">is the difference between the <i>observed</i> number of heads and the <i>expected</i> number of heads.</span></span><br />
<div style="text-align: center;">
<table align="center" border="0" style="background-color: white; border-collapse: collapse; border-spacing: 0px; border: 1px solid black; color: black; font-size: 14px; font: inherit; line-height: 17px; margin: 1em 2em; padding: 0px; text-align: start; vertical-align: baseline;"><thead style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; text-decoration: underline; vertical-align: baseline;"><strong style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: Times, Times New Roman, serif;">Tosses</span></strong></span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; text-decoration: underline; vertical-align: baseline;"><strong style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: Times, Times New Roman, serif;">Heads</span></strong></span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; text-decoration: underline; vertical-align: baseline;"><strong style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><span style="font-family: Times, Times New Roman, serif;">Difference</span></strong></span></td></tr>
</thead><tbody style="border: 0px; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;">
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">10</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">4</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-1</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">20</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">10</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">0</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">30</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">17</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">2</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">40</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">21</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">1</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">50</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">25</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">0</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">60</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">29</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-1</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">70</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">32</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-3</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">80</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">35</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-5</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">90</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">40</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-5</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">100</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">44</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-6</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">200</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">98</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-2</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">300</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">146</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-4</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">400</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">199</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">-1</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">500</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">255</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">5</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">600</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">312</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">12</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">700</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">368</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">18</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">800</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">413</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">13</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">900</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">458</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">8</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">1000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">502</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">2</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">2000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">1013</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">13</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">3000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">1510</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">10</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">4000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">2029</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">29</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">5000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">2533</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">33</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">6000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">3009</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">9</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">7000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">3516</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">16</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">8000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">4034</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">34</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">9000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">4538</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">38</span></td></tr>
<tr align="center" style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 0px; vertical-align: baseline;"><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">10000</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">5067</span></td><td style="border: 1px solid black; font-size: 14px; font: inherit; margin: 0px; padding: 4px; text-align: left; vertical-align: middle;"><span style="font-family: Times, Times New Roman, serif;">67</span></td></tr>
</tbody></table>
</div>
<span style="font-family: Times, Times New Roman, serif;">Below I plot the data in the third column: the difference between the observed number of heads and the expected number of heads is diverging</span><span style="background-color: white; font-family: Times, 'Times New Roman', serif; line-height: 17px;"> (which is the exact opposite of most people's intuition)!</span><span style="background-color: white; font-family: Times, 'Times New Roman', serif; line-height: 17px;"> </span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Kerrich-difference-btw-observed-expected-heads.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Kerrich-difference-btw-observed-expected-heads.png" width="400" /></a></div>
<span style="font-family: Times, Times New Roman, serif;"><br /></span><span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;">Perhaps Kerrich made a mistake (he didn't), but we can check his results via simulation! First, a single replication of T = 10,000 flips of a fair coin...</span></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Replication-of-Kerrich-experiment.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Replication-of-Kerrich-experiment.png" width="400" /></a></div>
<span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;"><br /></span></span>
<span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;">Again, we observe divergence (but this time in the opposite direction!). For good measure, I ran N=100 replications of the same experiment (i.e., flipping a coin T=10,000 times). The result is the following nice graphic...</span></span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Simulation-of-Differences.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Simulation-of-Differences.png" width="400" /></a></div>
<span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;"><br /></span></span><span style="background-color: white; line-height: 17px;"><span style="font-family: Times, Times New Roman, serif;">Our simulations suggest that Kerrich's result was indeed typical. </span></span><span style="font-family: Times, Times New Roman, serif;"><span style="background-color: white; line-height: 17px;">The LLN does</span><span style="background-color: white; line-height: 17px;"> </span><i style="border: 0px; font: inherit; line-height: 17px; margin: 0px; padding: 0px; vertical-align: baseline;">not</i><span style="background-color: white; line-height: 17px;"> </span><span style="background-color: white; line-height: 17px;">say that as T increases the <i>observed number</i> of heads will be close to the <i>expected number</i> of heads! What the LLN says instead is that, as T increases, the <i>sample average</i> (i.e., the fraction of flips that come up heads) will get closer and closer to the true population average (which in this case, with our fair coin, is 0.5). </span></span><br />
<div style="background-color: white; border: 0px; font: inherit; line-height: 17px; margin-top: 1em; padding: 0px; vertical-align: baseline;">
<span style="font-family: Times, Times New Roman, serif;">Let's run another simulation to verify that the LLN actually holds. In this experiment I conduct N=100 runs of T=10,000 coin flips each, re-computing the sample average after each successive flip.</span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Demonstration-of-LLN.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-24-Demonstration-of-LLN.png" width="400" /></a></div>
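Both simulations above take only a few lines of NumPy. Here is a minimal sketch (the seed and the N=100, T=10,000 choices mirror the text, but the exact numbers will of course differ from my plots):

```python
import numpy as np

rng = np.random.default_rng(42)

T = 10_000   # flips per run
N = 100      # replications

# each row is one run of T fair-coin flips (1 = heads)
flips = rng.integers(0, 2, size=(N, T))
cum_heads = flips.cumsum(axis=1)

t = np.arange(1, T + 1)
# difference between observed and expected heads: typically wanders away from 0...
diff = cum_heads - 0.5 * t
# ...while the sample average converges to 0.5 (the LLN)
avg = cum_heads / t

print(np.abs(diff[:, -1]).mean())      # typically on the order of sqrt(T)
print(np.abs(avg[:, -1] - 0.5).max())  # small for every run
```

Plotting each row of `diff` and `avg` against `t` reproduces the two figures above.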
<div style="text-align: left;">
As always <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">code and data</a> are available! Enjoy.</div>
David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-31698907517494050122012-12-23T07:20:00.001-05:002012-12-23T07:20:01.901-05:00Graph of the DayEarlier this week I used Pandas to grab some historical data on the S&P 500 from Yahoo!Finance and generate a simple time series plot. Today, I am going to re-examine this data set in order to show the importance of scaling and adjusting for inflation when plotting economic data.<br />
<br />
I again use the functions from the <span style="font-family: Courier New, Courier, monospace;">pandas.io.data</span><span style="font-family: Times, Times New Roman, serif;"> module to grab the data. Specifically, I use </span><span style="font-family: Courier New, Courier, monospace;">get_data_yahoo('^GSPC')</span><span style="font-family: Times, Times New Roman, serif;"> to get the S&P 500 time series, and </span><span style="font-family: Courier New, Courier, monospace;">get_data_fred('CPIAUCSL')</span><span style="font-family: Times, Times New Roman, serif;"> to grab the consumer price index (CPI). Here is a naive plot of the historical S&P 500 index from 1950 through 2012 (as usual, it includes grey NBER recession bands).</span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-23-Nominal%20SP500%20(Monthly).png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-23-Nominal%20SP500%20(Monthly).png" width="400" /></a></div>
<span style="font-family: Times, 'Times New Roman', serif;">Note that, because the CPI data are at a monthly frequency, I resample the daily S&P 500 data by taking monthly averages. </span><span style="font-family: Times, Times New Roman, serif;">This plot might make you conclude that there was a massive structural break/regime change around the year 2000 </span><span style="font-family: Times, 'Times New Roman', serif;">in whatever underlying process is generating the S&P 500. However, </span><span style="font-family: Times, 'Times New Roman', serif;">as the level of the S&P 500 increases, </span><span style="font-family: Times, 'Times New Roman', serif;">the linear scale on the vertical axis makes changes from month to month seem more dramatic. To control for this, I simply make the vertical scale logarithmic (now equal distances on the vertical axis represent equal percentage changes in the S&P 500).</span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-23-Nominal%20SP500%20(Monthly,%20log-scale).png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-23-Nominal%20SP500%20(Monthly,%20log-scale).png" width="400" /></a></div>
<span style="font-family: Times, Times New Roman, serif;">Now the "obvious" structural break in the year 2000 no longer seems so obvious. Indeed there was a period of roughly 10-15 years during the late 1960's through 1970's during which the S&P 500 basically moved sideways in a similar manner to what we have experienced during the last 10+ years. </span><br />
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;">This brings us to another, more significant, problem with these graphs: neither of them adjusts for inflation! When plotting long economic time series it is always a good idea to adjust for inflation. The 1970's was a period of fairly high inflation in the U.S.; thus the fact that the nominal value of the S&P 500 didn't change all that much over this period tells us that, in real terms, the value of the S&P 500 fell considerably. </span><br />
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;">Using the CPI data from FRED, it is straightforward to convert the nominal value of the S&P 500 index to a real value for some base month/year. Below is a plot of the real S&P 500 (in Nov. 2012 dollars), with a logarithmic scale on the vertical axis. As expected, the real S&P 500 declined significantly during the 1970's.</span><br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-23-Real%20SP500%20(Monthly,%20log-scale).png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-23-Real%20SP500%20(Monthly,%20log-scale).png" width="400" /></a></div>
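The resampling and deflation steps themselves are just a couple of lines of pandas. A sketch using toy stand-in series (the real data come from <span style="font-family: Courier New, Courier, monospace;">get_data_yahoo('^GSPC')</span> and <span style="font-family: Courier New, Courier, monospace;">get_data_fred('CPIAUCSL')</span>; the sample values below are made up purely for illustration):

```python
import numpy as np
import pandas as pd

# Toy stand-ins for the downloaded series.
days = pd.date_range('2010-01-01', '2010-12-31', freq='D')
sp500_daily = pd.Series(1000 + np.arange(len(days), dtype=float), index=days)

months = pd.date_range('2010-01-01', periods=12, freq='MS')
cpi = pd.Series(np.linspace(216.0, 220.0, 12), index=months)

# 1. resample the daily prices to monthly averages (the CPI is monthly)
sp500_monthly = sp500_daily.resample('MS').mean()

# 2. deflate: real = nominal * (CPI in the base month / CPI in each month)
base = cpi.iloc[-1]                    # use the most recent month as the base
real_sp500 = sp500_monthly * (base / cpi)

print(real_sp500.head())
```

In the base month the real and nominal values coincide by construction; for earlier months (when the CPI was lower) the real value exceeds the nominal one.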
<span style="font-family: Times, Times New Roman, serif;"><br /></span>
<span style="font-family: Times, Times New Roman, serif;"><a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">Code</a> is available on GitHub. Enjoy!</span><br />
<br />David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-17929501014806279232012-12-22T05:59:00.001-05:002013-01-25T12:26:25.613-05:00Graph(s) of the DaySuppose a large number of identical firms in a perfectly competitive industry, each with a constant returns to scale (CRTS) Cobb-Douglas production function: \[Y = F(K, L) = K^{\alpha}(AL)^{1 - \alpha}\] Output, Y, is homogeneous of degree one in capital, K, and labor, L; technology, A, is <i>labor augmenting</i>.<br />
<br />
Typically, we economists model firms as choosing demands for capital and labor in order to maximize profits while taking prices as given (i.e., unaffected by the decisions of the individual firm):\[\max_{K,L} \Pi = K^{\alpha}(AL)^{1 - \alpha} - (wL + rK)\] where the prices are $1, w, r$. Note that I am following convention in assuming that the price of the output good is the <i>numeraire</i> (i.e., normalized to 1) and thus the real wage, $w$, and the return to capital, $r$, are both relative prices expressed in terms of units of the output good.<br />
<br />
The first order conditions (FOCs) of a typical firm's maximization problem are \[\begin{align}\frac{\partial \Pi}{\partial K}=&0 \implies r = \alpha K^{\alpha-1}(AL)^{1 - \alpha} \label{MPK}\\<br />
\frac{\partial \Pi}{\partial L}=&0 \implies w = (1 - \alpha) K^{\alpha}(AL)^{-\alpha}A \label{MPL}\end{align}\] Dividing $\ref{MPK}$ by $\ref{MPL}$ (and a bit of algebra) yields the following equation for the optimal capital/labor ratio: \[\frac{K}{L} = \left(\frac{\alpha}{1 - \alpha}\right)\left(\frac{w}{r}\right)\] The fact that, for a given set of prices $w$, $r$, the optimal choices of $K$ and $L$ are indeterminate (any ratio of $K$ and $L$ satisfying the above condition will do) implies that the optimal scale of the firm is also indeterminate.<br />
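This indeterminacy is easy to verify numerically. A quick sketch (the value α = 1/3 and the factor quantities below are arbitrary choices of mine, not calibrated values):

```python
import numpy as np

alpha, A = 1 / 3, 1.0

def output(K, L):
    """CRTS Cobb-Douglas production function Y = K^a (AL)^(1-a)."""
    return K**alpha * (A * L)**(1 - alpha)

# pick any (K, L) and back out the prices from the FOCs
K0, L0 = 2.0, 1.0
r = alpha * K0**(alpha - 1) * (A * L0)**(1 - alpha)
w = (1 - alpha) * K0**alpha * (A * L0)**(-alpha) * A

# the optimal capital/labor ratio condition holds...
assert np.isclose(K0 / L0, (alpha / (1 - alpha)) * (w / r))

# ...and profits are zero at ANY scale along that ray (Euler's theorem)
for scale in [0.5, 1.0, 7.0]:
    K, L = scale * K0, scale * L0
    assert np.isclose(output(K, L) - (w * L + r * K), 0.0)
```

The loop is the point: every scaling of (K0, L0) satisfies the FOCs at the same prices and earns exactly zero profit, so the firm's size is pinned down by nothing in the model.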
<br />
How can I create a graphic that clearly demonstrates this property of the CRTS production function? I can start by fixing values for the wage and return to capital and then creating contour plots of the production frontier and the cost surface.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-22-Contour-plots-for-output-and-costs.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-22-Contour-plots-for-output-and-costs.png" width="400" /></a></div>
The above contour plots are drawn for $w\approx0.84$ and $r\approx0.21$ (which implies an optimal capital/labor ratio of 2:1). You should recognize the contour plot for the production surface (left) from a previous post. The contour plot of the cost surface (right) is a simple plane (which is why the isocost lines are lines and not curves!). Combining the contour plots allows one to see the set of tangency points between isoquants and isocosts.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-22-Firm-size-indeterminate-with-CRTS.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-22-Firm-size-indeterminate-with-CRTS.png" width="400" /></a></div>
<br />
A firm manager is indifferent among these points of tangency, and thus the size/scale of the firm is indeterminate. Indeed, with CRTS a firm will earn zero profits at each of the tangency points in the above contour plot.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-22-Zero-profits-with-CRTS.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-22-Zero-profits-with-CRTS.png" width="400" /></a></div>
<br />
<br />
As usual, the <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">code</a> is available on GitHub.<br />
<br />
<i>Update: </i>Installing <a href="http://www.mathjax.org/" target="_blank">MathJax</a> on my blog to render mathematical equations was <a href="http://irrep.blogspot.co.uk/2011/07/mathjax-in-blogger-ii.html" target="_blank">easy</a> (just a quick cut and paste job).David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-29094255611375580352012-12-21T07:29:00.000-05:002012-12-23T06:53:42.033-05:00Gun control... Via <a href="http://economistsview.typepad.com/economistsview/2012/12/what-do-economists-know-about-guns.html" target="_blank">Mark Thoma</a>, <a href="http://newmonetarism.blogspot.co.uk/2012/12/guns.html" target="_blank">Steve Williamson</a> has an excellent post about the economics of gun control:<br />
<blockquote class="tr_bq">
<div class="post-body entry-content" id="post-body-2519040346260759288" itemprop="description articleBody" style="background-color: white; color: #333333; font-family: Georgia, serif; font-size: 13px; line-height: 1.6em; margin: 0px 0px 0.75em;">
What's the problem here? People buy guns for three reasons: (i) they want to shoot animals with them; (ii) they want to shoot people with them; (iii) they want to threaten people with them. There are externalities. Gun manufacturers and retailers profit from the sale of guns. The people who buy the guns and use them seem to enjoy having them. But there are third parties who suffer. People shooting at animals can hit people. People who buy guns intending to protect themselves may shoot people who in fact intend no harm. People may temporarily feel compelled to harm others, and want an efficient instrument to do it with.<br />
<br />
There are also information problems. It may be difficult to determine who is a hunter, who is temporarily not in their right mind, and who wants to put a loaded weapon in the bedside table.<br />
<br />
What do economists know? We know something about information problems, and we know something about mitigating externalities. Let's think first about the information problems. Here, we know that we can make some headway by regulating the market so that it becomes segmented, with these different types of people self-selecting. This one is pretty obvious, and is a standard part of the conversation. Guns for hunting do not need to be automatic or semi-automatic, they do not need to have large magazines, and they do not have to be small. If hunting weapons do not have these properties, who would want to buy them for other purposes?<br />
<br />
On the externality problem, we can be more inventive. A standard tool for dealing with externalities is the Pigouvian tax. Tax the source of the bad externality, and you get less of it. How big should the tax be? An unusual problem here is that the size of the externality is random - every gun is not going to injure or kill someone. There's also an inherent moral hazard problem, in that the size of the externality depends on the care taken by the gunowner. Did he or she properly train himself or herself? Did they store their weapon to decrease the chance of an accident?<br />
<br />
What's the value of a life? I think when economists ask that question, lay people are offended. I'm thinking about it now, and I'm offended too. If someone offered me \$5 million for my cat, let alone another human being, I wouldn't take it.<br />
<br />
In any case, the Pigouvian tax we would need to correct the externality should be a large one, and it could generate a lot of revenue. If there are 300 million guns in the United States, and we impose a tax of \$3600 per gun on the current stock, we would eliminate the federal government deficit. But \$3600 is coming nowhere close to the potential damage that a single weapon could cause. A potential solution would be to have a gun-purchaser post collateral - several million dollars in assets - that could be confiscated in the event that the gun resulted in injury or loss of life. This has the added benefit of mitigating the moral hazard problem - the collateral is lost whether the damage is "accidental" or caused by, for example, someone who steals the gun.<br />
<br />
Of course, once we start thinking about the size of the tax (or collateral) needed to correct the inefficiency that exists here, we'll probably come to the conclusion that it is more efficient just to ban particular weapons and ammunition at the point of manufacture. I think our legislators should take that as far as it goes.<br />
<div style="clear: both;">
</div>
</div>
<div class="post-footer" style="background-color: white; color: #999999; font-family: 'Trebuchet MS', Trebuchet, Arial, Verdana, sans-serif; font-size: 10px; letter-spacing: 0.1em; line-height: 1.4em; margin: 0.75em 0px; text-transform: uppercase;">
</div>
</blockquote>
<br />David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-39522822603925444892012-12-21T06:46:00.000-05:002012-12-21T06:47:42.524-05:00Graph of the DayToday's graphic demonstrates the use of Pandas to grab data from Yahoo!Finance. The <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">code</a> I wrote uses <span style="font-family: Courier New, Courier, monospace;">pandas.io.data.get_data_yahoo()</span> to grab historical daily data on the S&P 500 index and then generates a simple time series plot. I went ahead and added the NBER recession bars for good measure. Note the use of a logarithmic scale on the vertical axis. Enjoy!<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-21-SP500.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-21-SP500.png" width="400" /></a></div>
David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-20618239005865042742012-12-20T05:54:00.000-05:002012-12-20T05:54:13.862-05:00Graph of the DayToday's graph is a combined 3D plot of the production frontier associated with the constant returns to scale Cobb-Douglas production function and a contour plot showing the <a href="http://en.wikipedia.org/wiki/Isoquant" target="_blank">isoquants</a> of the production frontier. This static snapshot was written up using matplotlib (the <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">code</a> also includes an interactive version of the 3D production frontier implemented in <a href="http://docs.enthought.com/mayavi/mayavi/index.html" target="_blank">Mayavi</a>).<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-20-CRTS-production-frontier.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="200" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-20-CRTS-production-frontier.png" width="400" /></a></div>
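For anyone wanting to reproduce the static snapshot, here is a minimal matplotlib sketch (the α value, grid ranges, and colormap are my own choices; the Agg backend is selected so the script runs headless):

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend for scripting
import matplotlib.pyplot as plt
import numpy as np

alpha, A = 1 / 3, 1.0
K, L = np.meshgrid(np.linspace(0.1, 10, 100), np.linspace(0.1, 10, 100))
Y = K**alpha * (A * L)**(1 - alpha)

fig = plt.figure(figsize=(10, 4))

# left panel: the 3D production frontier
ax1 = fig.add_subplot(121, projection='3d')
ax1.plot_surface(K, L, Y, cmap='viridis', linewidth=0)
ax1.set_xlabel('K'); ax1.set_ylabel('L'); ax1.set_zlabel('Y')

# right panel: the isoquants (contours of constant output)
ax2 = fig.add_subplot(122)
ax2.contour(K, L, Y, levels=10, cmap='viridis')
ax2.set_xlabel('K'); ax2.set_ylabel('L')

fig.savefig('crts-production-frontier.png')
```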
<br />
At some point I will figure out how to embed the interactive Mayavi plot into a blog post so that readers can manipulate the plot and change parameter values. If anyone knows how to do this already, a pointer would be much appreciated!David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0tag:blogger.com,1999:blog-325397004329727405.post-4717647990971729652012-12-19T10:18:00.000-05:002012-12-23T06:52:43.890-05:00Blogging to resume again!It has been far too long since my last post. Life (becoming a father), travel (summer research trip to SFI), teaching (am teaching a course on Computational Economics), and research (also trying to finish my PhD!) have a way of getting in the way of my blogging. As a mechanism to slowly move back into the blog world, I have decided to start a 'Graphic of the Day' series. Each day I will create a new economic graphic using my favorite Python libraries (mostly <a href="http://pandas.pydata.org/">Pandas</a>, <a href="http://matplotlib.org/">matplotlib</a>, <a href="http://www.scipy.org/Download">NumPy/Scipy</a>).<br />
<br />
The inaugural 'Graph of the Day' is Figure 1-1 from Mankiw's intermediate undergraduate textbook <i>Macroeconomics</i>.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-19-Mankiw-Figure-1-1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="300" src="https://raw.github.com/davidrpugh/beyond-microfoundations/master/Graphs%20of%20the%20day/2012-12-19-Mankiw-Figure-1-1.png" width="400" /></a></div>
<br />
Real GDP measures the total income of everyone in the economy, and real GDP per person measures the income of the average person in the economy. The figure shows that real GDP per person tends to grow over time and that this normal growth is sometimes interrupted by periods of declining income (i.e., the grey NBER bars!), called recessions or depressions. <br />
<br />
Note that real GDP per person is plotted on a logarithmic scale. On such a scale, equal distances on the vertical axis represent equal <i>percentage</i> changes. This is why the distance between \$8,000 and \$16,000 (a 100% increase) is the same as the distance between \$32,000 and \$64,000 (also a 100% increase).<br />
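This property is easy to check directly, since distances on a log axis depend only on ratios:

```python
import numpy as np

# equal percentage changes map to equal distances on a log scale
d1 = np.log(16_000) - np.log(8_000)    # a 100% increase
d2 = np.log(64_000) - np.log(32_000)   # also a 100% increase
assert np.isclose(d1, d2)              # both distances equal log(2)
```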
<br />
The Python code is available on <a href="https://github.com/davidrpugh/beyond-microfoundations/tree/master/Graphs%20of%20the%20day" target="_blank">GitHub</a> for download (I used <span style="font-family: Courier New, Courier, monospace;">pandas.io.data.get_data_fred()</span> to grab the data). The graphic is a bit boring. I was a bit depressed to find that the longest time series for U.S. per capita real GDP only goes back to 1960! This seems a bit scandalous...but perhaps I was just using the wrong data tags!David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com1tag:blogger.com,1999:blog-325397004329727405.post-75594564334354314452012-06-07T16:13:00.000-04:002012-12-08T09:22:13.838-05:005-6 June 2012 at the 2012 CSSS...For the past two days the focus has been on tools for analyzing non-linear dynamical (and specifically chaotic) systems. We have been using a program called <a href="http://www.mpipks-dresden.mpg.de/~tisean/Tisean_3.0.1/index.html">TISEAN</a> to do most of the analysis. There also seem to be a number of <i>R</i> packages for doing non-linear time series analysis: <i><a href="http://cran.r-project.org/web/packages/RTisean/index.html">RTisean</a></i>, <i><a href="http://cran.r-project.org/web/packages/tsDyn/index.html">tsDyn</a></i>, <i><a href="http://cran.r-project.org/web/packages/tseriesChaos/index.html">tseriesChaos</a></i>, etc.<br />
<h3>
<b>Henon Map: </b></h3>
Our first task was to simply use TISEAN to generate some trajectories of the <a href="http://en.wikipedia.org/wiki/H%C3%A9non_map">Henon map</a>, plot them using our favorite plotting tool (at the moment I am working on improving my Python coding so I am using matplotlib) and then analyze the <a href="http://en.wikipedia.org/wiki/Spectral_density">power spectrum</a> (sometimes called spectral density). TISEAN's version of the Henon map is:<br />
<br />
<div style="text-align: center;">
x<sub>t+1</sub> = 1 - Ax<sub>t</sub><sup>2</sup> + By<sub>t</sub></div>
<div style="text-align: center;">
y<sub>t+1</sub> = x<sub>t</sub></div>
<br />
For A=0.8 and B=0, the attractor is a simple 2-cycle, which means that a plot of the trajectory of the map in state space will yield two points:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeO-cji1Gc4cgKP-TOZzDev6Ew4-gfEIWscbsyjYZSgl4n5EmmSBj5VmMivdCtN9tQ5Lu8lAq64C_lnoUQPC_gMD4iCG98EGVqvOSFj5Yuchi4WmKEQljKi0ZgdFJIKCZUJibuq09tFTw/s1600/Henon+trajectory+for+A+=+0.8,+B+=+0.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgeO-cji1Gc4cgKP-TOZzDev6Ew4-gfEIWscbsyjYZSgl4n5EmmSBj5VmMivdCtN9tQ5Lu8lAq64C_lnoUQPC_gMD4iCG98EGVqvOSFj5Yuchi4WmKEQljKi0ZgdFJIKCZUJibuq09tFTw/s320/Henon+trajectory+for+A+=+0.8,+B+=+0.png" width="320" /></a></div>
However, for A=1.4 and B=0.3, the Henon map displays chaotic dynamics:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3jhWLyfT5yYnxvYGy2J-tqhRBOc2_19GxszVk6wvit-bKZH7hna4sJuclB-WDB_kWjCuFpmrtYYcRs9x1osdzgj6iLtxcfFTqiSzXW5E4JxddxtswShHSM8tHzs43BGtjgKtW4CSHjCw/s1600/Henon+trajectory+for+A+=+1_4,+B+=+0_3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em; text-align: center;"><img border="0" height="265" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi3jhWLyfT5yYnxvYGy2J-tqhRBOc2_19GxszVk6wvit-bKZH7hna4sJuclB-WDB_kWjCuFpmrtYYcRs9x1osdzgj6iLtxcfFTqiSzXW5E4JxddxtswShHSM8tHzs43BGtjgKtW4CSHjCw/s400/Henon+trajectory+for+A+=+1_4,+B+=+0_3.png" width="400" /></a></div>
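Generating these trajectories doesn't actually require TISEAN; a short NumPy sketch of the map (the initial condition and transient length are my own choices):

```python
import numpy as np

def henon(a, b, n, x0=0.1, y0=0.0, transient=1000):
    """Iterate TISEAN's Henon map, discarding an initial transient."""
    x, y = x0, y0
    xs = np.empty(n)
    for t in range(transient + n):
        x, y = 1.0 - a * x**2 + b * y, x   # simultaneous update
        if t >= transient:
            xs[t - transient] = x
    return xs

# A=0.8, B=0: the attractor is a 2-cycle, so x_{t+2} = x_t
x2 = henon(0.8, 0.0, 100)
assert np.allclose(x2[:-2], x2[2:])

# A=1.4, B=0.3: the chaotic Henon attractor (no such periodicity)
xc = henon(1.4, 0.3, 100)
assert not np.allclose(xc[:-2], xc[2:])
```

Scatter-plotting consecutive pairs (x_t, x_{t+1}) of each series reproduces the two state-space plots above.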
<b>Power Spectrum: </b>A good place to start the analysis of time series data is to examine the power spectrum (or spectral density) of the data. For the Henon map with A=0.8 and B=0, the attractor is a 2-cycle which implies that the dominant frequency should be 1/2. <br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY3upiwDW-okJkcU5mdaxdIVYKqCKTH2MG-ueXVlog6WPtHCMXaAq9ktatJx2lb15SHLp8JQNF8Lpo1o4ZYmmvq5o89bF0DnU-nFAhIhRzvljaC-J98-wYLKbAKI7SxVXW6DcUPgvOKlU/s1600/Power+Spectrum+of+Henon+Trajectory+(2-cycle).png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="212" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgY3upiwDW-okJkcU5mdaxdIVYKqCKTH2MG-ueXVlog6WPtHCMXaAq9ktatJx2lb15SHLp8JQNF8Lpo1o4ZYmmvq5o89bF0DnU-nFAhIhRzvljaC-J98-wYLKbAKI7SxVXW6DcUPgvOKlU/s320/Power+Spectrum+of+Henon+Trajectory+(2-cycle).png" width="320" /></a></div>
Note the above plot has a single "spike" at a frequency of 0.5. What other frequencies are present in the time series generated by the Henon map? Given that the map is a 2-cycle, in theory, there should be only a single frequency present in the data. Although my computer can represent much smaller numbers, the smallest relative difference my computer can distinguish (i.e., my machine ε) is 2.2204460492503131e-16. The other "spikes" in the above plot are thus numerical noise: artifacts of my computer doing calculations with numbers too small for it to handle properly. This is a good example of <a href="http://en.wikipedia.org/wiki/Arithmetic_underflow">arithmetic underflow</a>. <br />
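The 2-cycle's spectrum can be checked with NumPy's FFT instead of TISEAN; a small sketch (the two alternating values are arbitrary):

```python
import numpy as np

# a pure 2-cycle: the signal alternates between two values
x = np.tile([0.2, 1.0], 500)      # 1000 samples, period 2
x = x - x.mean()                  # remove the zero-frequency component

power = np.abs(np.fft.rfft(x))**2
freqs = np.fft.rfftfreq(len(x))   # cycles per time step, in [0, 0.5]

# all the power sits at frequency 1/2, as expected for a 2-cycle
print(freqs[np.argmax(power)])    # → 0.5
```

Every other bin of `power` is zero up to floating-point round-off, which is exactly the sea of tiny "spikes" visible in the plot above.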
<br />
Given that the Henon map with A=1.4 and B=0.3 exhibits chaotic dynamics, we expect the power spectrum to show power spread across all frequencies (a broadband spectrum).<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8ryfssS2s2k3RRZbKm2uj5M5YDIKcBS0vO7rGBimmXjmvnPt5_466pzf8jxi0b6ns-3f8cbesRP2EAFdluIIdOrZshSeYMjNxg5OVpwjpO-x9Ydm_lV-IszSPQL-X8hr_mw8Dj-Wasrc/s1600/Power+Spectrum+of+Henon+Trajectory+(chaotic).png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8ryfssS2s2k3RRZbKm2uj5M5YDIKcBS0vO7rGBimmXjmvnPt5_466pzf8jxi0b6ns-3f8cbesRP2EAFdluIIdOrZshSeYMjNxg5OVpwjpO-x9Ydm_lV-IszSPQL-X8hr_mw8Dj-Wasrc/s320/Power+Spectrum+of+Henon+Trajectory+(chaotic).png" width="320" /></a></div>
<h3>
<b>The Lorenz System: </b></h3>
Next we want to plot some trajectories of the <a href="http://en.wikipedia.org/wiki/Lorenz_system">Lorenz system</a> and then analyze the resulting power spectra. This is a classic chaotic system, developed by <a href="http://en.wikipedia.org/wiki/Edward_Norton_Lorenz">Edward Lorenz</a> as a simplified model of atmospheric convection. For parameter values R=15, S=16, B=4 the system settles onto a stable fixed-point attractor...<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIh30ngw-rEe8LslMZfkGgSVlIoepl2uLzauqFdF_kV4pUHO0XA5CEN3_ebRXNwoF3iS-rpl0vHZx7fx7sRuNWyPpm0KMULa28G9WUiaJOi7byrgFZYk39yb83bCOuJUMDPR7SM9_X1Q8/s1600/Lorenz+Trajectory+for+R+=+15.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="212" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiIh30ngw-rEe8LslMZfkGgSVlIoepl2uLzauqFdF_kV4pUHO0XA5CEN3_ebRXNwoF3iS-rpl0vHZx7fx7sRuNWyPpm0KMULa28G9WUiaJOi7byrgFZYk39yb83bCOuJUMDPR7SM9_X1Q8/s320/Lorenz+Trajectory+for+R+=+15.png" width="320" /></a></div>
...or in 3D if you prefer:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggD7ZimJfvIehfNVxij0JrkVX0Xac30Ho7Z0DMQHV-rpY5sBuOmo7-UBSk9YG9WYg7Cer8TyOFdMu2vsIPgjTn6uHfzgAqyxOeHDKz-kMJYAsuzXSHk-tuemmKxwOC4JSSskkOiSKyyaA/s1600/3D+plot+of+the+Lorenz+trajectory+for+R+=+15,+S+=+16,+and+B+=+4.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEggD7ZimJfvIehfNVxij0JrkVX0Xac30Ho7Z0DMQHV-rpY5sBuOmo7-UBSk9YG9WYg7Cer8TyOFdMu2vsIPgjTn6uHfzgAqyxOeHDKz-kMJYAsuzXSHk-tuemmKxwOC4JSSskkOiSKyyaA/s400/3D+plot+of+the+Lorenz+trajectory+for+R+=+15,+S+=+16,+and+B+=+4.png" width="400" /></a></div>
The power spectrum for the fixed point attractor looks as follows:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixNZM8OgEyH-A3YAT2Gzlnr-K3TMaLlWl2E9Zc9fTvqsOiO7cZ6XCtLocQ8V0dTVmulL77XJtY3zECmYJfduwOvazg7gIBTVNvqIp8rwPWUO2yMlU6T3P5tt-2GswwSkwFU4LeYeL8uqA/s1600/Power+Spectrum+of+Lorenz+Trajectory+for+R+=+15.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixNZM8OgEyH-A3YAT2Gzlnr-K3TMaLlWl2E9Zc9fTvqsOiO7cZ6XCtLocQ8V0dTVmulL77XJtY3zECmYJfduwOvazg7gIBTVNvqIp8rwPWUO2yMlU6T3P5tt-2GswwSkwFU4LeYeL8uqA/s320/Power+Spectrum+of+Lorenz+Trajectory+for+R+=+15.png" width="320" /></a></div>
One gets much more interesting dynamics out of the Lorenz system simply by changing R. For parameter values R=45, S=16, B=4 the system exhibits chaos:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh21hwznGmx5Wmn_-gCMT4XajbTALgdUOb8hFSpNATfMDePqvDiXcnz66g46-UVw1xpKxEgqcV3JM3ZEgCnSuDZRuSru1-o18RWkMmh-s1u0t0LjNoXIVNWkIKDE1mGA4ybTWldOcxxey8/s1600/Lorenz+Trajectory+for+R+=+45.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="212" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh21hwznGmx5Wmn_-gCMT4XajbTALgdUOb8hFSpNATfMDePqvDiXcnz66g46-UVw1xpKxEgqcV3JM3ZEgCnSuDZRuSru1-o18RWkMmh-s1u0t0LjNoXIVNWkIKDE1mGA4ybTWldOcxxey8/s320/Lorenz+Trajectory+for+R+=+45.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOdp44NaPMM7XHLDzIQZJfnmS_fMhkpEz1E1Lqd7ZjdByPzA-d9ndsZrjbejkz6m0uanMRD0SErzJPahk3cNM2vomfual_IOJqzM6JpsBhwiUEbBNx8r4jPk8IYuO1OJCmJ3WzmTOtsz8/s1600/3D+plot+of+the+Lorenz+trajectory+for+R+=+45,+S+=+16,+and+B+=+4.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhOdp44NaPMM7XHLDzIQZJfnmS_fMhkpEz1E1Lqd7ZjdByPzA-d9ndsZrjbejkz6m0uanMRD0SErzJPahk3cNM2vomfual_IOJqzM6JpsBhwiUEbBNx8r4jPk8IYuO1OJCmJ3WzmTOtsz8/s400/3D+plot+of+the+Lorenz+trajectory+for+R+=+45,+S+=+16,+and+B+=+4.png" width="400" /></a></div>
For R=45, the power spectrum exhibits power at all frequencies:<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4RI2ODLYGAGmaVT5Q7XsuwlelI69ROFT-bCgy5v9q4g95xbiVs_QZT5cUXbxqJCDIEgcxjASmyBzAN35_dyRB3tzfVQZNtrljYN2V-tXMvZT2yaCB6URKFzyXaP_oRNWNJbahNPAgLfc/s1600/Power+Spectrum+of+Lorenz+Trajectory+for+R+=+45.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh4RI2ODLYGAGmaVT5Q7XsuwlelI69ROFT-bCgy5v9q4g95xbiVs_QZT5cUXbxqJCDIEgcxjASmyBzAN35_dyRB3tzfVQZNtrljYN2V-tXMvZT2yaCB6URKFzyXaP_oRNWNJbahNPAgLfc/s320/Power+Spectrum+of+Lorenz+Trajectory+for+R+=+45.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<br /></div>
<div class="separator" style="clear: both; text-align: left;">
Python code (and the data files if you do not have TISEAN installed) for replicating the above can be found <a href="https://sites.google.com/site/beyondmicrofoundationscoderepo/home/python/dynamics">here</a>.</div>
David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com1tag:blogger.com,1999:blog-325397004329727405.post-45295298771548942652012-06-04T23:29:00.000-04:002012-06-05T16:15:28.235-04:00Monday 4 June 2012 at the 2012 CSSS...Today's lecture, given by <a href="http://santafe.edu/about/people/profile/Elizabeth%20Bradley">Prof. Liz Bradley</a>, focused on the basics of non-linear (mostly chaotic) dynamics in both discrete and continuous time. Much (all?) of the material in the lecture I had already encountered before in my own reading, but it was nice to get a refresher. What follows is my summary of our discussion of the dynamical properties of that classic example of chaotic dynamics in discrete time, the <a href="http://en.wikipedia.org/wiki/Logistic_map">logistic map</a>. The seminal reference for the logistic map is probably the <a href="http://www.math.miami.edu/~hk/csc210/week3/May_Nature_76.pdf">1976 <i>Nature</i></a> paper by Robert May.<br />
<br />
<b>The Logistic Map </b>is an innocent looking non-linear equation:<br />
<br />
<div style="text-align: center;">
X<sub>t+1</sub> = <i>r </i>X<sub>t </sub>(1 - X<sub>t</sub>)
<br />
<br />
<div style="text-align: left;">
where the state space of the model is the unit interval [0, 1] and <i>r </i>is a parameter that varies on (0, 4]. </div>
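Simulating the map takes only a few lines of Python. Here is a minimal sketch, using for illustration the r = 2 and x<sub>0</sub> = 0.2 values from the first example:

```python
def logistic_map(r, x):
    """One iteration of the logistic map X_{t+1} = r * X_t * (1 - X_t)."""
    return r * x * (1.0 - x)

def trajectory(r, x0, n):
    """Return the iterates [x0, x1, ..., xn]."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic_map(r, xs[-1]))
    return xs

# For r = 2 every initial condition in (0, 1) converges quickly to 1/2.
xs = trajectory(r=2.0, x0=0.2, n=50)
```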
<div style="text-align: left;">
<br /></div>
<div style="text-align: left;">
<b>Some trajectories of the logistic map for various values of <i>r</i>:</b></div>
<div style="text-align: left;">
If 0 < <i>r</i> ≤ 1, then the dynamics are trivial: the model converges to X = 0 no matter the initial condition. Now suppose <i>r</i> = 2. The dynamics are still pretty boring: no matter the initial condition, all trajectories of the logistic map converge (quite quickly) to the unique steady state value of 1/2.</div>
<div style="text-align: left;">
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGF43xTx9zPdoAhV4mjNkSNC4UwnJxxCR-GAA1t4RdmbXPnne2pEp89bbrlxCjztI6mJJFlurrhlmdTOhzxJWFLGvnRLJY4w7AgEjiPVQizNt5-nLjbJF5r65ArvtKhtAgQt-XZ6EkWjA/s1600/Logistic+Map+Trajectory.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGF43xTx9zPdoAhV4mjNkSNC4UwnJxxCR-GAA1t4RdmbXPnne2pEp89bbrlxCjztI6mJJFlurrhlmdTOhzxJWFLGvnRLJY4w7AgEjiPVQizNt5-nLjbJF5r65ArvtKhtAgQt-XZ6EkWjA/s320/Logistic+Map+Trajectory.png" width="320" /></a></div>
<table align="center" cellpadding="0" cellspacing="0" class="tr-caption-container" style="margin-left: auto; margin-right: auto; text-align: center;"><tbody>
<tr><td style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkw2QPPDrRREbtgoN8ekZ9wJU-GJMKgCtkI6jHjCyKVOmpVryax-geIQSA0z9UCLV7C51u4_lmIkicCvx_jOK4JwVTqYjtzv-deTRnrrtwjDHypLghSVD4hMVCSs3vYUbciD8x5QrFBtg/s1600/Trajectory+of+Logistic+Map+in+Phase+Space.png" imageanchor="1" style="margin-left: auto; margin-right: auto;"><img border="0" height="210" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkw2QPPDrRREbtgoN8ekZ9wJU-GJMKgCtkI6jHjCyKVOmpVryax-geIQSA0z9UCLV7C51u4_lmIkicCvx_jOK4JwVTqYjtzv-deTRnrrtwjDHypLghSVD4hMVCSs3vYUbciD8x5QrFBtg/s320/Trajectory+of+Logistic+Map+in+Phase+Space.png" width="320" /></a></td></tr>
</tbody></table>
</div>
<div style="text-align: center;">
<div style="text-align: left;">
Now suppose that <i>r = </i><span style="text-align: center;">2.919149. The result is still convergence to a unique fixed point, but the dynamics are more interesting: the trajectories now exhibit damped oscillations.</span></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjlHq49aoDXx2n15E4F2HamgNi-F31YvpgWymdWLXk0W6p25k9cWd4VGfrsYRvgImJDaEfR-zS06XAIu45qmKKSU26qwFbC7IrnQMzz21-ZzgtihaTi8j7Hgvi-g0LAagc4GfpSp2FSUaU/s1600/Logistic+Map+Trajectory.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjlHq49aoDXx2n15E4F2HamgNi-F31YvpgWymdWLXk0W6p25k9cWd4VGfrsYRvgImJDaEfR-zS06XAIu45qmKKSU26qwFbC7IrnQMzz21-ZzgtihaTi8j7Hgvi-g0LAagc4GfpSp2FSUaU/s320/Logistic+Map+Trajectory.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhFjkQmSdUpBVQpOC6TZO6SA0Ikk-wI9ZmWvdq7BaCkO3xK7zWqvaK3X7p1e83kNT6uIGs4twoz1sgfVqAb0b2ud80rv_UjngLgcqaLI2TicO8tZIZkbUNuFyjbI5P8qIzG-D9hEpN0wcI/s1600/Trajectory+of+Logistic+Map+in+Phase+Space.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhFjkQmSdUpBVQpOC6TZO6SA0Ikk-wI9ZmWvdq7BaCkO3xK7zWqvaK3X7p1e83kNT6uIGs4twoz1sgfVqAb0b2ud80rv_UjngLgcqaLI2TicO8tZIZkbUNuFyjbI5P8qIzG-D9hEpN0wcI/s320/Trajectory+of+Logistic+Map+in+Phase+Space.png" width="320" /></a></div>
<div style="text-align: left;">
For values of <i>r</i> satisfying 3 < <i>r</i> < 3.45 (roughly!), you get a 2-cycle:</div>
<div style="text-align: left;">
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg290jQXlJb0u-wqQYA6XnRrNjk-QX2NJpzi0hE0sNV5M_GkoV2Rnrkb6fAB8BX623Ydj4Zuyhe_qe4Fp4v1sayyrN35Eezvxc2sGXq8ry1_KV83otEGJt9Hgdxih4mmx3qvXGVSlGecvo/s1600/Logistic+Map+Trajectory.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg290jQXlJb0u-wqQYA6XnRrNjk-QX2NJpzi0hE0sNV5M_GkoV2Rnrkb6fAB8BX623Ydj4Zuyhe_qe4Fp4v1sayyrN35Eezvxc2sGXq8ry1_KV83otEGJt9Hgdxih4mmx3qvXGVSlGecvo/s320/Logistic+Map+Trajectory.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiN-x-HHGMO960ScLqARErfySJgrhkk1W-X8wUcEQZjRewBz1Ay-jAVqZZVWSETaIQs2CA9QyhQ4dGbFY61e1jPhL_mjQU3-UxtKUJhs7BcJNCb1NjEXfzi54l5pqU9Gi1WtyCHEwS6Tls/s1600/Trajectory+of+Logistic+Map+in+Phase+Space.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiN-x-HHGMO960ScLqARErfySJgrhkk1W-X8wUcEQZjRewBz1Ay-jAVqZZVWSETaIQs2CA9QyhQ4dGbFY61e1jPhL_mjQU3-UxtKUJhs7BcJNCb1NjEXfzi54l5pqU9Gi1WtyCHEwS6Tls/s320/Trajectory+of+Logistic+Map+in+Phase+Space.png" width="320" /></a></div>
For <i>r </i>= 3.8285, one gets the famous 3-cycle, which is one of the generally accepted indicators of chaotic dynamics (recall Li and Yorke's result that "period three implies chaos"). Note that I have changed the initial condition from 0.2 to 0.5 to eliminate the transient, which makes the cobweb diagram a bit cleaner:</div>
<div style="text-align: left;">
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmLtUWalUVG9T_TMVVilQgnEvXzbOD8WQfPqceJyHi4NuG0DLV58i3P_ZJbCdWfrGtx0hNPkfHfQXxVzxDjm7lCJJQpW__J6T6imHPb7r2HMS34h-RToWBU7FDdwgFbIES78ezz-JpiGs/s1600/Logistic+Map+Trajectory.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhmLtUWalUVG9T_TMVVilQgnEvXzbOD8WQfPqceJyHi4NuG0DLV58i3P_ZJbCdWfrGtx0hNPkfHfQXxVzxDjm7lCJJQpW__J6T6imHPb7r2HMS34h-RToWBU7FDdwgFbIES78ezz-JpiGs/s320/Logistic+Map+Trajectory.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoa3a4ArHSCoiqvWFp0zZpBN8uVILL3AeFjVss57Pjukka_uaSwJG-WjfzvOm_MWVupqdGnTh75fOBgFHWXo9Y67SIwTIMdOVSO0rDts_ZXPvjbZoB5yFOkmOTvBVy6kDl6CxihU7VSq4/s1600/Trajectory+of+Logistic+Map+in+Phase+Space.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjoa3a4ArHSCoiqvWFp0zZpBN8uVILL3AeFjVss57Pjukka_uaSwJG-WjfzvOm_MWVupqdGnTh75fOBgFHWXo9Y67SIwTIMdOVSO0rDts_ZXPvjbZoB5yFOkmOTvBVy6kDl6CxihU7VSq4/s320/Trajectory+of+Logistic+Map+in+Phase+Space.png" width="320" /></a></div>
Finally, for <i>r</i> = 4 we get an example of chaotic dynamics. Note that the deterministic trajectory for the logistic map looks incredibly "random."<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgEUcS-ujTeVnkj63q1GofxdKzvO1qsMnpMk2PXhvk0JZnOA1qvq807Q9BmGZCVhGrT6eFXISBnyOZv_StZqXtUGnP51hjEMAL2dFxQ78yh2tGPP7unycRpPTZwXjx3C8-anxg7qsC4kOI/s1600/Logistic+Map+Trajectory.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgEUcS-ujTeVnkj63q1GofxdKzvO1qsMnpMk2PXhvk0JZnOA1qvq807Q9BmGZCVhGrT6eFXISBnyOZv_StZqXtUGnP51hjEMAL2dFxQ78yh2tGPP7unycRpPTZwXjx3C8-anxg7qsC4kOI/s320/Logistic+Map+Trajectory.png" width="320" /></a></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqQRyJbilXZSEevR2hLyqHnlEqe7Yy5cExkE_gcaBm_kJa1P3fv_ZPvHNYfYaWIflJhyphenhyphenwRF4F2Goj1ZzX_pgzclqYx7HHS_JzcgiWffks4CprSClgOzGC7k0wRFVuKT9fQMlUEKmJYctY/s1600/Trajectory+of+Logistic+Map+in+Phase+Space.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="213" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiqQRyJbilXZSEevR2hLyqHnlEqe7Yy5cExkE_gcaBm_kJa1P3fv_ZPvHNYfYaWIflJhyphenhyphenwRF4F2Goj1ZzX_pgzclqYx7HHS_JzcgiWffks4CprSClgOzGC7k0wRFVuKT9fQMlUEKmJYctY/s320/Trajectory+of+Logistic+Map+in+Phase+Space.png" width="320" /></a></div>
<b>3D plot of the phase space for the logistic map:</b><br />
One of the reasons why models with chaotic dynamics, such as the logistic map, exhibit sensitive dependence on initial conditions is that such maps repeatedly "stretch and fold" the state space over which they are defined. While 2D phase plots give a sense of how the logistic map "stretches" the state space, a 3D phase plot is a really cool way to see how the logistic map "folds" the state space. <br />
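That sensitive dependence is easy to demonstrate numerically: perturb the initial condition at r = 4 by 1e-10 and track the gap between the two trajectories. Because the Lyapunov exponent of the logistic map at r = 4 is ln 2, the gap roughly doubles each iteration, so the trajectories become macroscopically different after a few dozen steps (the perturbation size and horizon below are my own choices):

```python
def iterate_logistic(r, x0, n):
    """Return the first n+1 iterates of the logistic map starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two trajectories at r = 4 whose initial conditions differ by 1e-10.
a = iterate_logistic(4.0, 0.2, 100)
b = iterate_logistic(4.0, 0.2 + 1e-10, 100)
gaps = [abs(u - v) for u, v in zip(a, b)]
# After roughly 35 iterations (2**35 * 1e-10 > 1) the gap is order one.
```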
<div class="separator" style="clear: both; text-align: center;">
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEEdSuEdIVNyvNrcyuPAH7aZRhCmLtOOykxcDwWe799BM5jkg04shyphenhyphenWicnvBd6LUBVYWvSzlGmnXs07-wi-BsAshwIvSKzQXxNbubR5LVfcyxaOmzvvHR-Q0AKAuCiv7v0hU4SpKzyN1g/s1600/3D+Phase+Plot+of+Logistic+Map.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="266" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjEEdSuEdIVNyvNrcyuPAH7aZRhCmLtOOykxcDwWe799BM5jkg04shyphenhyphenWicnvBd6LUBVYWvSzlGmnXs07-wi-BsAshwIvSKzQXxNbubR5LVfcyxaOmzvvHR-Q0AKAuCiv7v0hU4SpKzyN1g/s400/3D+Phase+Plot+of+Logistic+Map.png" width="400" /></a></div>
If you are interested, the Python code to replicate the above graphics can be found <a href="https://sites.google.com/site/beyondmicrofoundationscoderepo/home/python/dynamics">here</a>. I am still working on the code for the bifurcation diagram, for estimating <a href="http://en.wikipedia.org/wiki/Feigenbaum_constant">Feigenbaum's constant</a>, and for calculating <a href="http://en.wikipedia.org/wiki/Lyapunov_exponent">Lyapunov exponents</a>. </div>
</div>David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com11160 Camino De Cruz Blanca, Saint Johns College, Santa Fe, NM 87505, USA35.666849919397016 -105.9124088287353535.660399919397015 -105.92227932873536 35.673299919397017 -105.90253832873535tag:blogger.com,1999:blog-325397004329727405.post-59374743842111618112012-06-01T17:33:00.000-04:002012-06-01T17:33:39.917-04:00Santa Fe bound...Stopping over in Alexandria, VA to visit friends on the way to Santa Fe for the 2012 CSSS. Posting will be more frequent (I hope!) over the next month. David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com1tag:blogger.com,1999:blog-325397004329727405.post-62222685220312962302012-03-16T08:59:00.000-04:002012-03-16T08:59:46.257-04:00Equity returns: where power-laws go to die? Maybe...A major criticism of previous empirical work assessing support for the power-law as a model for equity returns is that very few (any?) of the studies assess goodness-of-fit of the power-law model. This post summarizes the results of goodness-of-fit testing for the power-law as a model for equity returns and is a continuation of my <a href="http://beyondmicrofoundations.blogspot.com/2012/02/power-laws-equity-returns-and-fallacy.html">previous</a> <a href="http://beyondmicrofoundations.blogspot.com/2012/02/can-better-choice-of-threshold-save.html">posts</a> on the power-law as a model for equity returns using data on U.S. equities listed on the <a href="http://en.wikipedia.org/wiki/Russell_1000_Index">Russell 1000 index</a>.<br />
<br />
I implement two versions of the KS goodness-of-fit tests suggested in <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a>:<br />
<ul><li>Parametric version: conceptually and computationally simple; assumes estimated x<sub>min</sub> is the "true" power-law threshold and simulates data under the null hypothesis of a power-law with exponent equal to estimated α</li>
<li>Non-parametric version: conceptually less straightforward and computationally <i>very</i> intensive because it makes use of the fact that x<sub>min</sub> is estimated from the data.</li>
</ul><b>Results:</b><br />
Using the parametric version of the KS goodness-of-fit test I am able to reject the power-law model as plausible (i.e., <i>p</i>-value ≤ 0.10) for 17% of positive tails and 12% of negative tails. I would not describe these results as overwhelmingly "anti-power-law," but I would remind readers that the parametric version of the goodness-of-fit test sets a lower-bound on support for the power-law (see my discussion of goodness-of-fit results for <a href="http://beyondmicrofoundations.blogspot.com/2012/03/zipfs-law-does-not-hold-for-mutual.html">mutual funds</a> for more details). Here is a histogram of the <i>p-</i>values for the positive and negative tails of equities in my sample: <br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhdWl5Mjsb9vii-RRnxHTB9aMD-DUQu2OQQXew78sbZx5CxgB2c8MrVskzXvOqUyX5IbhwsrKKlaSauwjYNZpL1AvXHPe8nHFfrbIkyOBuG8DwE4uNyl4VRvvHJtYANcb80N-Ty63VfmdI/s1600/histogramOfpvalues2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhdWl5Mjsb9vii-RRnxHTB9aMD-DUQu2OQQXew78sbZx5CxgB2c8MrVskzXvOqUyX5IbhwsrKKlaSauwjYNZpL1AvXHPe8nHFfrbIkyOBuG8DwE4uNyl4VRvvHJtYANcb80N-Ty63VfmdI/s400/histogramOfpvalues2.png" width="400" /></a></div>The results obtained using the more flexible non-parametric KS test are much less supportive of the power-law model. I reject the power-law model as plausible for roughly 44% of positive tails and 37% of negative tails. Here is another histogram of the goodness-of-fit <i>p</i>-values: <br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIcOxeTSyxgTZ8mNviuD28l97-4OmRMsleD2FA9V6OdXoecw1UsL5cfoTtPN1M6jYiY3nQcxaYnj90CpL3DGfazba-RyWbIzgfeKyNYOIZmZdtG4bQIXnWDAunl49aBefqiEFkv2QvKqU/s1600/histogramOfpvalues.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIcOxeTSyxgTZ8mNviuD28l97-4OmRMsleD2FA9V6OdXoecw1UsL5cfoTtPN1M6jYiY3nQcxaYnj90CpL3DGfazba-RyWbIzgfeKyNYOIZmZdtG4bQIXnWDAunl49aBefqiEFkv2QvKqU/s400/histogramOfpvalues.png" width="400" /></a></div>Clearly the non-parametric goodness-of-fit results are more damning for the power-law model than the parametric results. But why? <br />
<br />
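As an aside, the parametric version of the test is simple enough to sketch in a few lines of NumPy. This is a stripped-down illustration, not the code behind the results above: the threshold x<sub>min</sub> is treated as known, and the sample size, seed, and replication count are arbitrary choices of mine:

```python
import numpy as np

rng = np.random.default_rng(42)

def fit_alpha(data, xmin):
    """Hill (maximum likelihood) estimate of the continuous power-law exponent."""
    tail = data[data >= xmin]
    return 1.0 + tail.size / np.sum(np.log(tail / xmin))

def ks_distance(data, xmin, alpha):
    """KS distance between the empirical tail CDF and the fitted power-law CDF."""
    tail = np.sort(data[data >= xmin])
    n = tail.size
    model_cdf = 1.0 - (tail / xmin) ** (1.0 - alpha)
    empirical = np.arange(1, n + 1) / n  # empirical CDF at the sorted data
    return np.max(np.abs(empirical - model_cdf))

def draw_power_law(n, xmin, alpha):
    """Sample n observations from a continuous power-law via inverse transform."""
    return xmin * rng.random(n) ** (-1.0 / (alpha - 1.0))

def parametric_ks_pvalue(data, xmin, n_reps=200):
    """Fraction of synthetic power-law samples whose KS distance exceeds the observed one."""
    alpha_hat = fit_alpha(data, xmin)
    d_obs = ks_distance(data, xmin, alpha_hat)
    n = data[data >= xmin].size
    count = 0
    for _ in range(n_reps):
        synth = draw_power_law(n, xmin, alpha_hat)
        count += ks_distance(synth, xmin, fit_alpha(synth, xmin)) >= d_obs
    return count / n_reps

# Sanity check on data that really does follow a power-law.
data = draw_power_law(1000, xmin=1.0, alpha=2.5)
p_value = parametric_ks_pvalue(data, xmin=1.0)
```

The non-parametric version replaces the "refit α" step with a full re-estimation of both x<sub>min</sub> and α on each synthetic sample, which is what makes it so much more expensive.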
The discrepancy could be due to sample-size effects. The average number of tail observations for a given stock in my sample is 313 (positive tail) and 333 (negative tail): using daily data, I simply do not have that many tail observations to work with. Because of the small sample sizes I need to rely on the extra flexibility of the non-parametric version of the goodness-of-fit test to be able to reject the power-law model as plausible. I suspect that this would not be the case if I had access to the TAQ data set used by <a href="http://polymer.bu.edu/hes/articles/ggps03.pdf">Gabaix et al (<i>Nature</i>, 2003)</a>. It is also worth noting that the more data I get, the more likely I am to reject the power-law as plausible. Specifically, for those equities for which I reject the power-law model as plausible, for either the positive or negative tail (or both), I have a larger average number of both total observations and observations in the tail. I have observed this sample-size dynamic fairly consistently whilst fitting and testing power-law models on various data sets: the more data I obtain, the less support I find for a power-law and the more support I tend to find for heavy-tailed alternatives (particularly the log-normal). <br />
<br />
While I interpret the above results of goodness-of-fit tests as being decidedly against the hypothesis that the power-law model is "universal," clearly the power-law is a plausible model for either the positive or negative (or both) tails of returns for some stocks. In my mind this immediately suggests that there might be meaningful heterogeneity in the tail behavior of asset returns and that an interesting research direction might be to explore economic mechanisms that could generate such diversity of tail behavior. <br />
<br />
I end with a significant disclaimer. I am concerned that by ignoring the underlying time dependence (think "clustered volatility" and mean-reversion) in large returns that my goodness-of-fit test results might be biased against the power-law model. Suppose that <a href="http://polymer.bu.edu/hes/articles/ggps03.pdf">Gabaix et al (<i>Nature</i>, 2003)</a> are correct about equity returns having power-law tails. Given the dependence structure of returns, it could be the case that the typical KS distance between the best-fit power-law model and the "true" power-law is larger than what I estimate in implementing an iid goodness-of-fit test. <br />
<br />
If this is the case, then perhaps the reason I reject the power-law model so often is that my observed KS distance is obtained from fitting a power-law model to dependent data, whereas my bootstrap KS distances are obtained by fitting a power-law to synthetic data that follows a "true" power-law but ignores the underlying dependence structure of returns. In order to address this issue, I will need to develop an alternative goodness-of-fit testing procedure that can mimic the time dependence in the returns data! Fortunately, I have some ideas (good ones I hope!) on how to proceed... <b> </b>David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com1tag:blogger.com,1999:blog-325397004329727405.post-17199747950471349042012-03-12T07:11:00.000-04:002012-03-12T07:11:49.760-04:00Zipf's Law does not hold for mutual funds!<a href="http://polymer.bu.edu/hes/articles/ggps03.pdf">Gabaix et al (<i>Nature</i>, 2003)</a> and <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> lay out an economic theory of large fluctuations in share prices based, in part, on the assumption that the size (as measured in dollars of assets under management) of investors in asset markets is well approximated by <a href="http://en.wikipedia.org/wiki/Zipf%27s_law">Zipf's law</a> (i.e., a power-law with scaling exponent ζ ≈ 1 or α ≈ 2). 
Zipf's law has been purported to hold for cities ( <a href="http://psycnet.apa.org/psycinfo/1950-00412-000">Zipf (<i>Addison-Wesley</i>, 1949)</a>, <a href="http://qje.oxfordjournals.org/content/114/3/739.short">Gabaix (<i>QJE</i>, 1999)</a>, <a href="http://www.jstor.org/stable/10.2307/117093">Gabaix (<i>AER</i>, 1999)</a>, <a href="http://ase.tufts.edu/econ/papers/200310.pdf">Gabaix and Ioannides (2004)</a>, <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/zipfCCA-new.pdf">Gabaix (<i>AER</i>, 2011)</a>, etc), firms (<a href="http://chemlabs.nju.edu.cn/Literature/Zipf/Zipf%27s%20law%20in%20income%20distribution%20of%20companies.pdf">Okuyama et al (<i>Physica A, </i>1999)</a>, <a href="http://www.sciencemag.org/content/293/5536/1818.full">Axtell (<i>Science</i>, 2001)</a>, <a href="http://arxiv.org/pdf/cond-mat/0310061.pdf">Fujiwara et al (<i>Physica A, </i>2004)</a>), banks (<a href="http://www.sciencedirect.com/science/article/pii/S0378437103012391">Aref and Pushkin (2004)</a>), and mutual funds (<a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>). I say purported, because experience has taught me never to believe in a power-law that I haven't estimated myself! <br />
<br />
In this post, I am going to provide evidence against the power-law as an appropriate model for mutual funds using data from the same source as <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>. The figure below shows two survival plots of the size, as measured in terms of the $-value of assets under management, of U.S. mutual funds at the end of 2009 using data from <a href="http://www.crsp.com/">CRSP</a>.<sup>1</sup> The top panel shows the entire data set, while the bottom panel shows only the upper 20% of mutual funds (roughly those funds with assets under management greater than $1 billion) and is intended to match as closely as possible Figure VII from <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>.<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhaEMJWLcSQlCcwAwPb8Subkrv6oH71f1PaCU1fSE8bNW6gIj8AQBzAavKtlTkaBFEWOblRCFtVC0jhWMpW91B9CZx_5CtGTx4jHgwI40tGv-12SNRSfN_6swGtuPn4CWyAfVpElPZi58c/s1600/SurvivalPlots_2009Q4.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhaEMJWLcSQlCcwAwPb8Subkrv6oH71f1PaCU1fSE8bNW6gIj8AQBzAavKtlTkaBFEWOblRCFtVC0jhWMpW91B9CZx_5CtGTx4jHgwI40tGv-12SNRSfN_6swGtuPn4CWyAfVpElPZi58c/s400/SurvivalPlots_2009Q4.png" width="400" /></a></div>Choosing a threshold to include only the largest 20% of mutual funds for a given year, <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> report an average estimate for the power-law scaling exponent of ζ ≈ 1 (or α ≈ 2) over the period 1961-1999. <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> estimate α using OLS on the upper CDF of the mutual fund distribution (although they report similar results using the Hill estimator). <br />
<br />
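The rank-size (upper CDF) OLS regression can be sketched in a few lines. For illustration I apply it to synthetic data drawn from an exact Zipf law rather than the CRSP data, so the slope should come out near -1; the sample size and seed are my own choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic fund sizes drawn from an exact Zipf/Pareto law with zeta = 1
# (alpha = 2), via inverse-transform sampling: P(X > x) = (x / xmin)^(-1).
xmin = 1.0
sizes = xmin / rng.random(2000)

# Rank-size regression: regress log(rank) on log(size); the slope estimates -zeta.
sizes_desc = np.sort(sizes)[::-1]
log_rank = np.log(np.arange(1, sizes_desc.size + 1))
log_size = np.log(sizes_desc)
slope, intercept = np.polyfit(log_size, log_rank, deg=1)
zeta_hat = -slope
```

Note that OLS estimates of this kind are known to have poor standard errors relative to maximum likelihood, which is one reason to prefer the Hill estimator discussed below.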
Using my larger data set I estimate, via OLS and choosing the same 20% cut-off criterion (which leaves 1313 observations in the tail), a scaling exponent of ζ = 1.11 (or α ≈ 2.11). Here is a plot showing my OLS estimates:<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhG-hc-KqyumJ2yyDJSDc2hZGLSF2RWhmFhtJXzkDNptX7_uuqKc6QmSV8j1KEOoDmNjZOW3lwbjsB0OLfI42wh-EwPW5W3m8iWrS5Bq3z5iOjZqdc6nPhvfByXxJ3zO-Q_SutzvNMq_w8/s1600/PowerLawModelOLS_2009Q4.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhG-hc-KqyumJ2yyDJSDc2hZGLSF2RWhmFhtJXzkDNptX7_uuqKc6QmSV8j1KEOoDmNjZOW3lwbjsB0OLfI42wh-EwPW5W3m8iWrS5Bq3z5iOjZqdc6nPhvfByXxJ3zO-Q_SutzvNMq_w8/s400/PowerLawModelOLS_2009Q4.png" width="400" /></a></div><br />
I also estimate the scaling exponent using maximum likelihood, in two ways. First, I apply the Hill estimator to the data using the same 20% cut-off as in <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>; second, I re-estimate the scaling exponent using the Hill estimator, while choosing the threshold parameter to minimize the KS distance as in <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a>. Method 1 obtains an estimate of α = 1.97(3); method 2 obtains estimates of α = 2.04(3) and x<sub>min</sub> = $1.12 billion (which leaves 1077 observations in the tail). Note that the KS distance, D, for each of the maximum likelihood fits is smaller than the KS distance obtained using the OLS estimate of α. <br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkc2yohyphenhyphenMbzChGPTLKZ-Fx7S7VrrHAbZPDFLAOIxIXjkSFkS5sZPXXaPc80AXhM8TQ43rEMBl1-sVYSTQyHxxiWse9ElHr9I3nHp5_8gH6Lj4TWul6k4cwk3_BwmWdtBLnH9B48-67-8k/s1600/PowerLawModelsML_2009Q4.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjkc2yohyphenhyphenMbzChGPTLKZ-Fx7S7VrrHAbZPDFLAOIxIXjkSFkS5sZPXXaPc80AXhM8TQ43rEMBl1-sVYSTQyHxxiWse9ElHr9I3nHp5_8gH6Lj4TWul6k4cwk3_BwmWdtBLnH9B48-67-8k/s400/PowerLawModelsML_2009Q4.png" width="400" /></a></div>Numbers in parentheses show the amount of uncertainty in the final digit (obtained using a parametric bootstrap to estimate the standard error).<br />
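Method 2 can be sketched as follows: scan a grid of candidate thresholds, fit α to each candidate tail by maximum likelihood, and keep the threshold that minimizes the KS distance between the fitted power-law and the empirical tail CDF. The data below are synthetic (a power-law tail with α = 2 glued onto a non-power-law body), purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)

def hill_alpha(tail, xmin):
    """Maximum likelihood (Hill) estimate of alpha for a continuous power-law tail."""
    return 1.0 + tail.size / np.sum(np.log(tail / xmin))

def ks_distance(tail, xmin, alpha):
    """KS distance between the empirical and fitted power-law tail CDFs."""
    tail = np.sort(tail)
    empirical = np.arange(1, tail.size + 1) / tail.size
    model = 1.0 - (tail / xmin) ** (1.0 - alpha)
    return np.max(np.abs(empirical - model))

def fit_power_law(data, n_candidates=100):
    """Choose xmin to minimize the KS distance, as in Clauset et al (2009)."""
    candidates = np.quantile(data, np.linspace(0.0, 0.95, n_candidates))
    best = None
    for xmin in candidates:
        tail = data[data >= xmin]
        alpha = hill_alpha(tail, xmin)
        d = ks_distance(tail, xmin, alpha)
        if best is None or d < best[0]:
            best = (d, xmin, alpha)
    return best  # (KS distance, xmin, alpha)

# Illustrative data: power-law above xmin = 1 (alpha = 2), distorted below it.
tail_part = 1.0 / rng.random(1500)          # Pareto with alpha = 2, x >= 1
body_part = rng.uniform(0.05, 1.0, 1500)    # non-power-law "body"
data = np.concatenate([tail_part, body_part])
d, xmin_hat, alpha_hat = fit_power_law(data)
```

The candidate grid and the synthetic mixture are simplifications; the real procedure typically considers every observed value as a candidate threshold.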
<br />
Parameter uncertainty is estimated using the bootstrap:<br />
<ul><li>Using 20% cut-off and a parametric bootstrap, I estimate a se for α of 0.026 and a corresponding 95% confidence interval of (1.912, 2.013) </li>
<li>Choosing x<sub>min</sub> via <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> and using a parametric bootstrap, I estimate a se for α of 0.032 and a corresponding 95% confidence interval of (1.976, 2.098) </li>
<li>Finally, choosing x<sub>min</sub> via <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> and using a non-parametric bootstrap, I estimate a se for α of 0.059 and a corresponding 95% confidence interval of (1.932, 2.113); and an se for x<sub>min</sub> of $0.530 B and a corresponding 95% confidence interval of ($0.398 B, $1.332 B)</li>
</ul>Note that in all three cases, the 95% confidence interval for the estimated scaling exponent <i>includes </i>α=2 (i.e., Zipf's "law"). So far so good for <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>. <br />
However, what about goodness-of-fit? Good data analysis is a lot like good detective work, and it is important to collect as much evidence as possible, relevant to testing the hypothesis at hand, before passing judgement. As stressed in <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a>, an assessment of the goodness-of-fit of the power-law model is an important piece of relevant statistical evidence. Here are my goodness-of-fit test results: <br />
<ul><li>Using a 20% cut-off as suggested in <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> along with the parametric version of the KS goodness-of-fit test I obtain a <i>p</i>-value of roughly 0.00 using 2500 repetitions, which suggests that the power-law model is not plausible. </li>
<li>Choosing x<sub>min</sub> via <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> and using the parametric version of the KS goodness-of-fit test I obtain a <i>p</i>-value of roughly 0.19 using 2500 repetitions, which suggests that the power-law model is plausible. </li>
<li>Finally, choosing x<sub>min</sub> via <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> and using the non-parametric bootstrap version of the KS goodness-of-fit test I obtain a <i>p</i>-value of roughly 0.02 using 2500 repetitions, which again suggests that the power-law model is not plausible. </li>
</ul>On the whole, I think these results are not very supportive of the power-law model. Even though the power-law model remains plausible when I choose x<sub>min</sub> via <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> and assess goodness-of-fit using the parametric version of the KS test, such an assessment does not properly take into account the flexibility of the <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> procedure in choosing the threshold parameter (along with estimating the scaling exponent).<sup>2</sup> Once I take this additional flexibility into account (i.e., by using the non-parametric KS test), I again find that the power-law model is not plausible! Here is a set of density plots of the bootstrap KS distances from each version of the goodness-of-fit test that illustrates the differences between the parametric and non-parametric procedures (I hope!):<br />
<div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjS4nVWeZK1_unpTQqty2wyH97J5Zfdi4M3cAuTc0AX0zesD-aNNHc8tOldH9wkIIBWCFRKgAV-3E3WA9a_vODj4TLQW1CQD-_Oq_3OxJ4siAG4hALtN2YPixRCfmOOw3NZ3hpGSEWFnJM/s1600/GoF-MutualFunds.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjS4nVWeZK1_unpTQqty2wyH97J5Zfdi4M3cAuTc0AX0zesD-aNNHc8tOldH9wkIIBWCFRKgAV-3E3WA9a_vODj4TLQW1CQD-_Oq_3OxJ4siAG4hALtN2YPixRCfmOOw3NZ3hpGSEWFnJM/s400/GoF-MutualFunds.png" width="400" /></a></div>Note that implementing the non-parametric version of the KS goodness-of-fit test basically shifts and "condenses" the sampling distribution of the KS distance (relative to both parametric versions). Taking into account the additional flexibility of the <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> procedure for fitting the power-law null model reduces both the mean and variance of sampling distribution of the KS distance, <i>D. </i><br />
<br />
Now for a quick test of an alternative hypothesis. A very plausible alternative distribution for mutual fund size is the log-normal (recall that <a href="http://en.wikipedia.org/wiki/Gibrat%27s_law">Gibrat's law</a> of proportionate growth predicts a log-normal). Can I reject the power-law in favour of the log-normal using likelihood ratio tests? YES!<br />
<ul><li>Using a 20% cut-off as suggested in <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>, the Vuong LR test statistic is -3.63 with a two-sided <i>p</i>-value of roughly 0.00 (which implies that, given the data, I can distinguish between the power-law and log-normal) and a one-sided <i>p</i>-value of roughly 0.00 (implying that I can reject the power-law in favour of the log-normal!) </li>
<li>Choosing x<sub>min</sub> via <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a>, the Vuong LR test statistic is -2.27 with a two-sided <i>p</i>-value of roughly 0.023 (which implies that, given the data, I can distinguish between the power-law and log-normal) and a one-sided <i>p</i>-value of roughly 0.012 (implying that I can reject the power-law in favour of the log-normal!)</li>
</ul><div class="separator" style="clear: both; text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQBeeeKlFdBJSMlum564Zp8LSvKibU33iDIZ-IUGmBmRLpK_a3qH73buiskEdfhADMTv1ih5QU_BD3fwfEKhrfJF7q0ESo9oeK0Zchh85biBHJlA_n_1RV9LQcYlOjwLzvM-W_J0O6Q8A/s1600/AltHypotheses-MutualFunds.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="400" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjQBeeeKlFdBJSMlum564Zp8LSvKibU33iDIZ-IUGmBmRLpK_a3qH73buiskEdfhADMTv1ih5QU_BD3fwfEKhrfJF7q0ESo9oeK0Zchh85biBHJlA_n_1RV9LQcYlOjwLzvM-W_J0O6Q8A/s400/AltHypotheses-MutualFunds.png" width="400" /></a></div>What are the economic implications or all of this? Does it matter whether or not mutual fund size is distributed according to a log-normal or power-law distribution?<br />
<br />
I think it matters quite a bit for the model put forward in <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>! In <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>, investors take as given that the distribution of investors' size follows a power-law. Specifically, an investor makes use of the distribution of investor size in calculating his optimal trading volume. <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> rely on the power-law being a "good approximation" to the true distribution of investor size in order to justify investors taking a power-law distribution as given. I have provided evidence that the power-law is <i>not </i>a plausible model, and that a log-normal distribution is a significantly better fit. If the true distribution is not a power-law, then agents in <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> are effectively solving a mis-specified optimization program, and there is no longer any guarantee that the solution to the properly specified optimization program will result in power-law tails for equity returns and volume (paradoxically, however, this might turn out to be "good" for <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> in the sense that I have argued in <a href="http://beyondmicrofoundations.blogspot.com/2012/02/power-laws-equity-returns-and-fallacy.html">previous</a> <a href="http://beyondmicrofoundations.blogspot.com/2012/02/can-better-choice-of-threshold-save.html">posts</a> that the tails of equity returns are not power-law anyway!).<br />
<br />
However, whether or not it matters if a distribution is log-normal, power-law, or simply "heavy-tailed" depends on context. In this case a log-normal distribution is consistent with <a href="http://en.wikipedia.org/wiki/Gibrat%27s_law">Gibrat's law</a> of proportionate growth. <a href="http://en.wikipedia.org/wiki/Gibrat%27s_law">Gibrat's law</a> applied to investor size says that if the growth rate of investors' assets under management is independent of the amount of assets currently under management, then the distribution of investor size will follow a log-normal distribution. One could easily test whether or not the growth rate of mutual funds is independent of size. Maybe someone already has?<br />
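Such a test is easy to sketch: under <a href="http://en.wikipedia.org/wiki/Gibrat%27s_law">Gibrat's law</a>, the log growth rate of a fund should be uncorrelated with its initial size, so a regression of growth on log size should produce a slope statistically indistinguishable from zero. The panel below is entirely hypothetical synthetic data; with real mutual fund data one would simply swap in observed assets under management for two periods.

```python
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(42)

# hypothetical panel: assets under management for 1,000 funds in two periods
n_funds = 1000
size_t = rng.lognormal(mean=3.0, sigma=1.5, size=n_funds)  # initial fund sizes
growth = rng.normal(loc=0.05, scale=0.2, size=n_funds)     # log growth rates,
size_t1 = size_t * np.exp(growth)                          # independent of size

# under Gibrat's law the estimated slope should be close to zero
res = linregress(np.log(size_t), np.log(size_t1 / size_t))
```

A significantly non-zero slope (typically negative in firm-size studies, where small firms grow faster) would be evidence against Gibrat's law, and hence against the log-normal as the limiting distribution.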
Personally, I think the important takeaway from the above analysis is simply that there is quite extreme heterogeneity in the size of investors (although not extreme enough to justify a power-law)! In other words, the distribution of investor sizes is generically "heavy-tailed." Investors are not necessarily small relative to the "market," which suggests that at least some investors are unable to take prices parametrically (i.e., as given) when determining their optimal trading behavior. In this respect I wholeheartedly agree with <a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a>: investor size does play a significant role in determining the dynamics of asset prices. These results also suggest an alternative way to think about the liquidity of an asset. An asset might be very liquid (i.e., re-saleable) for one investor, but very illiquid for another, because the desired volume of trade is different! Liquidity might not simply be an inherent property of the asset itself, but may also depend on the "size" of the investor holding it! <br />
<br />
<sup>1</sup> <span style="font-size: xx-small;"><a href="http://pages.stern.nyu.edu/%7Exgabaix/papers/volatility.pdf">Gabaix et al (<i>QJE</i>, 2006)</a> use data on mutual fund assets from the 4th quarter of 1999, whereas I use the larger and more recent data set from the 4th quarter of 2009.</span><br />
<sup>2</sup><span style="font-size: xx-small;"> Assessing goodness-of-fit using a parametric version of the KS goodness-of-fit test that takes the optimal threshold chosen using the <a href="http://arxiv.org/abs/0706.1062">Clauset et al (<i>SIAM</i>, 2009)</a> method as given is both conceptually easier to understand, and computationally simpler to implement. This procedure also sets an effective lower bar for the plausibility of the power-law model: if the power-law model is not plausible using this parametric KS goodness-of-fit test, then it will be even <i>less</i> plausible if I use the more flexible (and more rigorous) non-parametric KS goodness-of-fit test. </span>David R. Pughhttp://www.blogger.com/profile/09032073870730301659noreply@blogger.com0