Friday 13 January 2017

Deriving Trading Strategies From Probability Distribution Functions

John Ehlers TECHNICAL PAPERS

John Ehlers, the developer of MESA, has written and published many papers relating to the principles of market cycles. Synopses of the available papers are shown below. Download any of them by selecting its associated hypertext link.

Why Traders Lose Money (and What to Do About It): An article in the May 2014 issue of Stocks & Commodities magazine described how to create artificial equity curves using only the profit factor and percent winners of a trading strategy. Bell-curve statistics for trading randomly selected stocks and for portfolio trading are also included. An accompanying Excel spreadsheet lets you experiment with these statistical descriptors of trading-system performance.

Predictive Indicators for Effective Trading Strategies: Technical traders understand that indicators must smooth market data to be useful, and that smoothing introduces lag as an undesired side effect. We also know that the market is fractal: a weekly-interval chart looks just like a monthly, daily, or intraday chart. What may not be quite so obvious is that as the time interval along the x-axis increases, the high-to-low price swings along the y-axis also increase roughly proportionally. This spectral dilation phenomenon causes an unwanted distortion that has either gone unrecognized or been largely ignored by indicator developers and market technicians.

Deriving Trading Strategies from Measured Probability Density Functions: This was the runner-up for the MTA's 2008 Charles H. Dow Award. In this paper I show the implications of different kinds of detrending and how the resulting probability distributions can be used as strategies to generate effective trading systems. The performance of these robust trading systems is compared with standard approaches.

This paper shows an interactive way to eliminate as much lag as desired from a smoothing filter. Reduced lag, of course, comes at the price of reduced filter smoothness. The filter exhibits none of the transient overshoot commonly found in higher-order filters.

Empirical Mode Decomposition: A new approach to cycle and trend mode detection.

Fourier Transform for Traders: The problem with using Fourier transforms to measure market cycles is that they have very poor resolution. In this paper I show how a nonlinear transformation can be used to sharpen the resolution so that Fourier transforms become usable. The measured spectrum is displayed as a heatmap.

Swiss Army Knife Indicators: Indicators are just transfer responses of input data. By simply changing constants, this one indicator can become an EMA, an SMA, a 2-pole Gaussian low-pass filter, a 2-pole Butterworth low-pass filter, an FIR smoother, a bandpass filter, or a bandstop filter.

Ehlers Filters: An unusual nonlinear FIR filter is described. This filter is among the most responsive to price changes, yet is smoother in sideways markets.

System Performance Evaluation: Profit factor (gross profits divided by gross losses) is equivalent to the payout odds in gambling. So when the profit factor is combined with the percent winners in a series of random events, examples of how a trading strategy's equity might grow can be simulated. This paper describes how common performance descriptors relate to these two parameters. An Excel spreadsheet is described that lets you perform a Monte Carlo analysis of your trading systems once you know these two parameters (out of sample).
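As a complement to the two spreadsheet papers above, here is a minimal Python sketch of the same idea: given only the percent winners and the profit factor, treat each trade as a Bernoulli draw and back out the average win size that reproduces the requested profit factor. The function and parameter names are illustrative, not Ehlers' own code, and the fixed win/loss sizes are a simplification of his spreadsheet.

    import random

    def simulate_equity(num_trades, pct_winners, profit_factor, avg_loss=1.0, seed=None):
        # Profit factor = (pct_winners * avg_win) / ((1 - pct_winners) * avg_loss),
        # so the average win consistent with the two inputs is:
        avg_win = profit_factor * (1.0 - pct_winners) * avg_loss / pct_winners
        rng = random.Random(seed)
        equity, curve = 0.0, []
        for _ in range(num_trades):
            equity += avg_win if rng.random() < pct_winners else -avg_loss
            curve.append(equity)
        return curve

    # 100 Monte Carlo replications of a system with 40% winners and profit factor 1.5
    finals = [simulate_equity(250, 0.4, 1.5, seed=k)[-1] for k in range(100)]
    print(min(finals), sum(finals) / len(finals), max(finals))

Running many replications shows how widely equity curves with identical summary statistics can diverge, which is the point of the exercise.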
FRAMA (FRactal Adaptive Moving Average): A nonlinear moving average is derived using the Hurst exponent.

MAMA is the Mother of All Adaptive Moving Averages. The name is actually an acronym for MESA Adaptive Moving Average. The nonlinear action of this filter is created by the rate of change of phase over each half cycle. In combination with FAMA, a Following Adaptive Moving Average, the crossovers form excellent entry and exit signals that are relatively free of whipsaws.

Time Warp, Without Space Travel: Laguerre polynomials are used to create a filter structure similar to a simple moving average, with the difference that the time spacing between the filter taps is non-uniform. The result makes it possible to build very short filters with the smoothing characteristics of much longer filters. Shorter filters mean less lag. The advantages of using Laguerre polynomials in filters are demonstrated both in indicators and in automatic trading systems. The article includes EasyLanguage code.

The CG Oscillator: The CG Oscillator is unique because it is an oscillator that is both smoothed and essentially free of lag. It finds the center of gravity (CG) of the price values in an FIR filter. The CG automatically has the smoothing of the FIR filter (similar to a simple moving average), while the position of the CG stays precisely in phase with the price movement. EasyLanguage code is included.

Using the Fisher Transform: Many trading systems are designed under the assumption that prices have a normal, or Gaussian, probability distribution about the mean. In fact, nothing could be further from the truth. This paper describes how the Fisher transform converts data so that it has very nearly a normal probability distribution. With the distribution made normal by applying the Fisher transform, the data can be used to generate entry points with surgical precision. The article includes EasyLanguage code.

The Inverse Fisher Transform: The inverse Fisher transform can be used to generate an oscillator that switches quickly between oversold and overbought without whipsawing.

Gaussian Filters: Lag is the downfall of smoothing filters. This article shows how lag can be reduced, and the greatest possible smoothness retained, by reducing the lag of the high-frequency components in the data. A complete table of Gaussian filter coefficients is provided.

Poles and Zeros: A description of digital filters in terms of Z transforms. The ramifications of higher-order filters are described. Tables of coefficients for 2-pole and 2-pole Butterworth filters are given.
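The Fisher transform paper above lends itself to a compact illustration. A minimal sketch, assuming the usual normalization of price into the interval (-1, 1) over a rolling window before applying y = 0.5 * ln((1 + x) / (1 - x)); Ehlers' published indicator also smooths the normalized value first, which is omitted here:

    import math

    def fisher_transform(prices, length=10):
        out = []
        for i in range(len(prices)):
            window = prices[max(0, i - length + 1): i + 1]
            hi, lo = max(window), min(window)
            # Normalize the latest price into (-1, 1) within the window
            x = 0.0 if hi == lo else 2.0 * (prices[i] - lo) / (hi - lo) - 1.0
            x = max(min(x, 0.999), -0.999)   # guard the singularities at +/-1
            out.append(0.5 * math.log((1.0 + x) / (1.0 - x)))
        return out

    print(fisher_transform([100, 101, 103, 102, 105, 104, 107], length=5))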
Professor John Geweke

Internationally renowned economist John Geweke joined UTS as a Distinguished Research Professor in the School of Business in 2009. Professor Geweke is distinguished for his contributions to econometric theory in time series analysis and Bayesian modelling, as well as for applications in macroeconomics, finance, and microeconomics. He is a Fellow of the Econometric Society and of the American Statistical Association. He has been co-editor of the Journal of Econometrics and the Journal of Applied Econometrics, and editor of the Journal of Business and Economic Statistics. His recent book, Complete and Incomplete Econometric Models, was published by Princeton University Press in January 2010. He currently leads the six-investigator ARC-sponsored project Massively Parallel Algorithms for Bayesian Inference and Decision Making.

Awards and recognition: Fellow of the Econometric Society, since 1982. Fellow of the American Statistical Association, since 1990. Alfred P. Sloan Research Fellow, 1982-1984. H. I. Romnes Faculty Fellow, University of Wisconsin, 1982-1983. Dayton-Hudson Fellowship, 1970-1974. National Merit Scholar, 1966-1970. Member, Phi Beta Kappa and Phi Kappa Phi. Listed in Marquis Who's Who in America and similar publications.

Previous academic positions: Harlan McGregor Chair in Economics and Professor of Economics and Statistics, University of Iowa, 1999-2009. Professor of Economics, University of Minnesota, 1990-2001. Director, Institute of Statistics and Decision Sciences, Duke University, 1987-1990. Professor of Statistics and Decision Sciences, Duke University, 1987-1990. Professor of Economics, Duke University, 1986-1996. Professor of Economics, Duke University, 1983-1986. Visiting Professor of Economics, Carnegie-Mellon University, 1982-1983. Visiting Professor, Duke University, 1982-1983. Professor of Economics, University of Wisconsin-Madison, 1979-1982. Visiting Scholar, Warwick University, 1979. Assistant Professor of Economics, University of Wisconsin-Madison, 1975-1979. Distinguished Visiting Professor, Economics Discipline Group. Associate Member, AAI - Advanced Analytics Institute. Doctor of Philosophy.

Geweke, J., Koop, G. & van Dijk, H. 2012, The Oxford Handbook of Bayesian Econometrics. © Oxford University Press 2011. All rights reserved. Bayesian econometric methods have enjoyed increasing popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. The Oxford Handbook of Bayesian Econometrics is a single source on Bayesian methods in specialized fields. It contains articles by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with articles on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools of Bayesian time series econometricians such as state space models and particle filtering. It also includes articles on Bayesian principles and methodology.
Geweke, J. 2010, Complete and Incomplete Econometric Models. 1, Princeton University Press, Princeton, USA. Econometric models are widely used in the creation and evaluation of economic policy in the public and private sectors. But these models are useful only if they adequately account for the phenomena in question, and they can be quite misleading if they do not. In response, econometricians have developed tests and other checks for model adequacy. All of these methods, however, take the specification of the model being tested as given. In this book, John Geweke addresses the critical earlier stage of model development, the point at which candidate models are inherently incomplete. Summarizing and extending recent advances in Bayesian econometrics, Geweke shows how simple modern simulation methods can complement the creative process of model formulation. These methods, which are accessible to economists and applied econometricians alike, streamline the processes of model development and specification checking. Complete with illustrations from a wide variety of applications, this is an important contribution to econometrics that will interest economists and graduate students alike.

Geweke, J. 2005, Contemporary Bayesian Econometrics and Statistics. 1, John Wiley & Sons Inc, USA. From the back cover: This publication provides readers with a thorough understanding of Bayesian analysis, which is grounded in the theory of inference and optimal decision making. Contemporary Bayesian Econometrics and Statistics provides readers with state-of-the-art simulation methods and models that are used to solve complex real-world problems. Armed with a strong foundation in both theory and practical problem solving, readers discover how to optimize decision making when faced with problems involving limited or incomplete data. The book begins by examining the theoretical and mathematical foundations of Bayesian statistics to help readers understand how and why it is used in problem solving. The author then describes how modern simulation methods make Bayesian approaches practical using widely available mathematical applications software. In addition, the author explains how models can be applied to specific problems, including linear models and policy choices, modeling with latent variables and missing data, time series models and prediction, and comparison and evaluation of models. The publication has been developed and fine-tuned through a decade of classroom experience, and readers will find the author's approach very engaging and accessible. There are nearly 200 examples and exercises to help readers see how effective use of Bayesian statistics enables them to make optimal decisions. MATLAB and S-Plus computer programs are integrated throughout the book. An accompanying website provides readers with data sets and computer code for many examples. This publication is tailored for researchers who use econometrics and similar statistical methods in their work.
With its emphasis on practical problem solving and extensive use of examples and exercises, this is also an excellent textbook for graduate-level students in a broad range of fields, including economics, statistics, the social sciences, business, and public policy.

Geweke, J., Bonnen, J., Koshel, J. & White, 1999, Sowing Seeds of Change: Informing Public Policy in the Economic Research Service of USDA. National Academy Press, Washington.

Geweke, J., Berry, D. & Chaloner, K. 1996, Bayesian Statistics and Econometrics: Essays in Honor of Arnold Zellner. Wiley, New York.

Geweke, J., Caines, P., Parzen, M. & Taqqu, M. 1993, New Directions in Time Series Analysis, Parts I and II. Springer-Verlag, New York.

Geweke, J. 1992, Decision Making under Risk and Uncertainty: New Models and Empirical Findings. Kluwer Academic Publishers, Dordrecht.

Geweke, J., Barnett, W. & Shell, K. 1989, Economic Complexity: Chaos, Sunspots, Bubbles and Nonlinearity. Cambridge University Press.

Geweke, J. 1985, Inferring Household Demand for Durable Goods with Heterogeneous Preferences: A Case Study.

Geweke, J. 1984, Chapter 19: Inference and Causality in Economic Time Series Models.

Geweke, J., Durham, G. & Xu, H. 2015, 'Bayesian inference for logistic regression models using sequential posterior simulation' in Upadhyay, S., Singh, U., Dey, D. & Loganathan, A. (eds), Current Trends in Bayesian Methodology with Applications. CRC Press, USA, pp. 290-310.

Durham, G. & Geweke, J. 2014, 'Adaptive sequential posterior simulators for massively parallel computing environments' in Jeliazkov, I. (ed), Advances in Econometrics. Emerald Group Publishing Limited, USA, pp. 1-44. (The abstract appears with the journal listing of this work below.)

Geweke, J., Koop, G. & van Dijk, H. 2011, 'Introduction' in Geweke, J., Koop, G. & van Dijk, H. (eds), The Oxford Handbook of Bayesian Econometrics. Oxford University Press, Oxford, pp. 1-8.
Bayesian econometric methods have enjoyed increasing popularity in recent years. Econometricians, empirical economists, and policymakers are increasingly making use of Bayesian methods. This handbook is a single source for researchers and policymakers wanting to learn about Bayesian methods in specialized fields, and for graduate students making the final step from textbook learning to the research frontier. It contains contributions by leading Bayesians on the latest developments in their specific fields of expertise. The volume provides broad coverage of the application of Bayesian econometrics in the major fields of economics and related disciplines, including macroeconomics, microeconomics, finance, and marketing. It reviews the state of the art in Bayesian econometric methodology, with chapters on posterior simulation and Markov chain Monte Carlo methods, Bayesian nonparametric techniques, and the specialized tools used by Bayesian time series econometricians such as state space models and particle filtering. It also includes chapters on Bayesian principles and methodology.

Geweke, J. 2009, 'The SETAR model of Tong and Lim and advances in computation' in Chan, K.S. (ed), Exploring a Non-linear World: An Appreciation of Howell Tong's Contributions to Statistics. World Scientific, Singapore, pp. 85-94. This discussion revisits Tong and Lim's seminal 1980 paper on the SETAR model in the context of the advances in computation since that time. Using the Canadian lynx data set from that paper, it compares exact maximum likelihood estimates with those in the original paper. It illustrates the application of Bayesian MCMC methods, developed in the interim, to this model and data set. It shows that SETAR is a limiting case of mixture-of-experts models and explores the application of a variant of those models to the lynx data set. The application is successful, despite the small size of the data set and the complexity of the model. Predictive likelihood ratios favor Tong and Lim's original model.
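To make the SETAR idea concrete, here is a minimal Python sketch of a two-regime, first-order self-exciting threshold autoregression. The parameter values are arbitrary illustrations, not the Tong and Lim lynx specification:

    import numpy as np

    def simulate_setar(n, phi_low, phi_high, threshold=0.0, delay=1, sigma=1.0, seed=0):
        # The AR coefficient switches according to whether the lagged
        # value y[t - delay] lies below or above the threshold.
        rng = np.random.default_rng(seed)
        y = np.zeros(n)
        for t in range(delay, n):
            phi = phi_low if y[t - delay] <= threshold else phi_high
            y[t] = phi * y[t - 1] + rng.normal(0.0, sigma)
        return y

    y = simulate_setar(500, phi_low=0.9, phi_high=-0.3)
    print(y[:5])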
Geweke, J., Horowitz, J.L. & Pesaran, H. 2008, 'Econometrics' in Durlauf, S.N. & Blume, L.E. (eds), The New Palgrave Dictionary of Economics online. Palgrave Macmillan, pp. 1-32. As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly. Substantial advances have been made in the analysis of cross-sectional data by means of semiparametric and nonparametric techniques. Heterogeneity of economic relations across individuals, firms, and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given more satisfactory foundations. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Nonlinear econometric techniques are used increasingly in the analysis of cross-section and time series observations. Applications of Bayesian techniques to econometric problems have been greatly promoted by advances in computing power and computational techniques. The use of Bayesian techniques has, in turn, provided researchers with a unifying framework in which the tasks of forecasting, decision making, model evaluation, and learning can be viewed as parts of the same interactive and iterative process, thus providing a basis for an econometric reconstruction.

Keane, M. & Geweke, J. 2006, 'Bayesian cross-sectional analysis of the conditional distribution of earnings of men in the USA, 1967-1996' in Upadhyay, S.K., Singh, U. & Dey, D.K. (eds), Bayesian Statistics and its Applications. Anshan Ltd, New Delhi, pp. 160-197.

Geweke, J. & Whiteman, C. 2006, 'Bayesian forecasting' in Elliott, G., Granger, C.W.J. & Timmermann, A. (eds), Handbook of Economic Forecasting. Elsevier, Netherlands, pp. 3-80. Bayesian forecasting is a natural product of a Bayesian approach to inference. The Bayesian approach in general requires explicit formulation of a model and conditioning on known quantities in order to draw inferences about unknown ones. In Bayesian forecasting, one simply takes a subset of the unknown quantities to be future values of some variables of interest. This chapter presents the principles of Bayesian forecasting and describes recent advances in computational capabilities for applying them that have dramatically expanded the scope of applicability of the Bayesian approach. It describes historical developments and the analytic compromises that were necessary prior to those advances, the application of the new procedures in a variety of examples, and reports on two long-term Bayesian forecasting exercises.

Geweke, J., Houser, D.E. & Keane, M. 2001, 'Simulation based inference for dynamic multinomial choice models' in A Companion to Theoretical Econometrics. Blackwell Publishing, USA, pp. 466-493.

Geweke, J. & Keane, M. 2001, 'Computationally intensive methods for integration in econometrics' in Handbook of Econometrics. Elsevier Inc, North Holland, pp. 3463-3568.

Geweke, J. 2001, 'Embedding Bayesian tools in mathematical software' in George, E.I. (ed), Bayesian Methods with Applications to Science, Policy and Official Statistics. Eurostat, Brussels, pp. 165-174. The BACC software provides its users with tools for Bayesian Analysis, Computation and Communication. These tools are embedded in mathematical software applications such as MATLAB and Gauss. From the user's perspective there is seamless integration of the specialized BACC commands with the powerful built-in commands of the host application. Several models are currently available, and BACC is designed to be extensible. We give a brief overview of using BACC with MATLAB and discuss the implementation of new models in BACC.

Geweke, J. & Keane, M. 2001, 'Chapter 56: Computationally intensive methods for integration in econometrics', pp. 3463-3568.
Until recently, inference in many interesting models was precluded by the requirement of high-dimensional integration. But dramatic increases in computing speed, and the recent development of new algorithms that permit accurate Monte Carlo evaluation of high-dimensional integrals, have greatly expanded the range of models that can be considered. This chapter presents the methodology for several of the most important Monte Carlo methods, supplemented by a set of concrete examples that show how the methods are used. Some of the examples are new to econometrics. They include inference in multinomial discrete choice models and selection models in which the usual normality assumption is relaxed in favor of a multivariate mixture-of-normals assumption. Several Monte Carlo experiments show that these methods are successful at identifying departures from normality when they are present. The focus of the chapter is on inference in parametric models that allow rich variation in the distribution of the disturbances. The chapter first covers Monte Carlo methods for the evaluation of high-dimensional integrals, including integral simulators such as the GHK method and Markov chain Monte Carlo methods such as Gibbs sampling and the Metropolis-Hastings algorithm. It then turns to methods for approximating solutions to discrete choice dynamic optimization problems, including those developed by Keane and Wolpin and by Rust, as well as methods for circumventing the integration problem entirely, such as the approach of Geweke and Keane. The remainder of the chapter takes up concrete examples: classical simulation estimation of multinomial probit models in both cross-sectional and panel data contexts, univariate and multivariate latent linear models, and Bayesian inference in dynamic discrete choice models in which the future component of the value function is replaced by a flexible polynomial. © 2001 Elsevier Inc. All rights reserved.
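Of the Markov chain Monte Carlo methods named in the abstract above, random-walk Metropolis-Hastings is the simplest to sketch. A toy Python illustration against a deliberately trivial target, a standard normal log posterior; the step size, starting point, and burn-in are arbitrary choices:

    import numpy as np

    def metropolis_hastings(log_post, x0, n_draws, step=0.5, seed=0):
        # Propose x' = x + step * z with z ~ N(0, 1) and accept
        # with probability min(1, p(x') / p(x)).
        rng = np.random.default_rng(seed)
        draws = np.empty(n_draws)
        x, lp = x0, log_post(x0)
        for i in range(n_draws):
            prop = x + step * rng.standard_normal()
            lp_prop = log_post(prop)
            if np.log(rng.random()) < lp_prop - lp:
                x, lp = prop, lp_prop
            draws[i] = x
        return draws

    draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=3.0, n_draws=20000)
    print(draws[5000:].mean(), draws[5000:].std())   # roughly 0 and 1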
Two further chapters appear in Mariano, R., Schuermann, T. & Weeks, M.J. (eds), Simulation-Based Inference in Econometrics: Methods and Applications, Cambridge University Press, Cambridge, pp. 255-299 and pp. 100-131.

Geweke, J. 1999, 'Simulation methods for model criticism and robustness analysis' in Berger, J.O., Bernardo, J.M., Dawid, A.P. & Smith, A.F.M. (eds), Bayesian Statistics 6. Oxford University Press, Oxford, UK, pp. 275-299. Abstract: This paper reviews and extends Bayesian methods of model criticism and robustness analysis. The objectives are to clarify the Bayesian interpretation of non-Bayesian diagnostic tests and to make explicitly Bayesian procedures accessible to practical researchers. Specific methods for prior density criticism and robustness analysis, and for criticism of the data density, are presented. All are based on approximating appropriate Bayes factors, and they avoid the need for posterior simulation under alternative model specifications. A general method of data density criticism is developed that requires neither posterior simulation nor analytical approximation under any model specification. Some of the methods presented here have been implemented in user-oriented software. The paper provides some simple illustrations of the methods.

Geweke, J. & Keane, M. 1999, 'Mixture of normals probit models' in Hsiao, C., Lahiri, K., Lee, L.F. & Pesaran, M.H. (eds), Analysis of Panels and Limited Dependent Variables: A Volume in Honour of G.S. Maddala. Cambridge University Press, Cambridge, pp. 49-78. Abstract: This paper generalizes the normal probit model of dichotomous choice by introducing mixtures of normal distributions for the disturbance term. By mixing on both the mean and variance parameters, and by increasing the number of distributions in the mixture, these models effectively remove the normality assumption and come much closer to semiparametric models. When a Bayesian approach is adopted, there is an exact finite-sample distribution theory for the choice probability conditional on the covariates. The paper uses artificial data to show how posterior odds ratios can discriminate between normal and non-normal distributions in probit models. The method is also applied to female labor force participation decisions in a sample of 1,555 observations from the PSID. In this application the Bayes factors strongly favor mixture-of-normals probit models over the conventional probit model, and the most favored models have mixtures of four normal distributions for the disturbance term.

Geweke, J. 1999, 'Some experiments on the construction of a hybrid model for macroeconomic analysis: A comment' in McCallum, B. (ed), Carnegie-Rochester Conference Series on Public Policy. Elsevier, pp. 143-147.

Geweke, J. 1997, 'Posterior simulators in econometrics' in Kreps, D.M. & Wallis, K.F. (eds), Advances in Economics and Econometrics: Theory and Applications. Cambridge University Press, Cambridge, pp. 128-165. Abstract: The development of posterior simulators over the past decade has revised the beliefs of many econometricians who have followed these developments closely about the three propositions above. The purpose of this paper is to convey these innovations, and their significance for applied econometrics, to econometricians who have not kept up with the relevant mathematical and applied literature. There are four main sections. One section takes up aspects of Bayesian inference essential to understanding the implications of posterior simulators for Bayesian econometrics. Another describes these simulators and provides the essential convergence results. Implications of these methods for some selected econometric models are drawn out in a third section. This is done with a view toward indicating the range of tasks for which posterior simulators are well suited, rather than providing a representative survey of the recent Bayesian econometric literature. Finally, the paper turns to some implications for model comparison, and for communication between those doing applied work and their audience, that are beginning to emerge from the use of posterior simulators in Bayesian econometrics.
Geweke, J. 1996, 'Bayesian inference for linear models subject to linear inequality constraints' in Modelling and Prediction: Honoring Seymour Geisser. Springer-Verlag, New York, pp. 248-263. Abstract: The normal linear model, with sign or other linear inequality constraints on its coefficients, arises very frequently in many scientific applications. Given the inequality constraints, Bayesian inference is much simpler than classical inference, but the standard Bayesian computational methods become impractical when the posterior probability of the inequality constraints (under a diffuse prior) is small. This paper shows how the Gibbs sampling algorithm provides an alternative, attractive approach to inference subject to linear inequality constraints in this situation, and how the GHK probability simulator can be used to assess the posterior probability of the constraints.

Geweke, J. 1996, 'Monte Carlo simulation and numerical integration' in Amman, H.M., Kendrick, D.A. & Rust, J. (eds), Handbook of Computational Economics. Elsevier, pp. 731-800. Abstract: This is a survey of simulation methods in economics, with a specific focus on integration problems. It describes acceptance methods, importance sampling procedures, and Markov chain Monte Carlo methods for simulation from univariate and multivariate distributions, and their application to the approximation of integrals. The exposition emphasizes combinations of different approaches and assessment of the accuracy of numerical approximations to integrals and expectations. The survey illustrates these procedures with applications to simulation and integration problems in economics.

Geweke, J. 1996, 'Variable selection and model comparison in regression' in Bernardo, J.M., Berger, J.O., Dawid, A.P. & Smith, A.F.M. (eds), Bayesian Statistics 5. Oxford University Press, Oxford, pp. 609-620. Abstract: In the specification of linear regression models it is common to indicate a list of candidate variables from which a subset enters the model with nonzero coefficients. This paper interprets this specification as a mixed continuous-discrete prior distribution for coefficient values. It then utilizes a Gibbs sampler to construct posterior moments. It is shown how this method can incorporate sign constraints and provide posterior probabilities for all possible subsets of regressors. The procedures are illustrated using some standard data sets.

Geweke, J. 1996, 'Variable selection tests of asset pricing models' in Gatsonis, C., Hodges, J.S., Kass, R.E., McCulloch, R.E., Rossi, P. & Singpurwalla, N.D. (eds), Case Studies in Bayesian Statistics (Volume 3). Springer-Verlag, New York.

Geweke, J. 1996, 'Chapter 15: Monte Carlo simulation and numerical integration', pp. 731-800.

Geweke, J. 1993, 'Inference and forecasting for chaotic nonlinear time series' in Day, R.H. & Chen, P. (eds), Nonlinear Dynamics and Evolutionary Economics. Oxford University Press, Oxford.

Geweke, J. 1993, 'A dynamic index model for large cross sections' in Stock, J. & Watson, M. (eds), New Research on Business Cycles, Indicators and Forecasting. National Bureau of Economic Research, New York.
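The Gibbs sampling approach of the 1996 inequality-constraints paper listed above can be illustrated in a toy setting: a normal linear model with known noise variance, a flat prior, and the constraint that every coefficient be nonnegative, so that each full conditional is a truncated normal. This sketch uses a naive rejection sampler for the truncated draws; the paper's machinery is considerably more efficient and general:

    import numpy as np

    rng = np.random.default_rng(1)

    # Toy data: y = X b + noise with true nonnegative coefficients
    n, k = 200, 3
    X = rng.normal(size=(n, k))
    beta_true = np.array([1.0, 0.5, 0.0])
    y = X @ beta_true + rng.normal(scale=1.0, size=n)
    sigma2 = 1.0                      # treat the noise variance as known

    def draw_truncated_normal(mean, sd):
        # Rejection sampler for N(mean, sd^2) truncated to [0, inf);
        # adequate for illustration when the mean is not far below zero.
        while True:
            z = rng.normal(mean, sd)
            if z >= 0.0:
                return z

    beta = np.ones(k)
    draws = []
    for it in range(5000):
        for j in range(k):
            xj = X[:, j]
            resid = y - X @ beta + xj * beta[j]   # remove j's own contribution
            var_j = sigma2 / (xj @ xj)
            mean_j = (xj @ resid) / (xj @ xj)
            beta[j] = draw_truncated_normal(mean_j, np.sqrt(var_j))
        draws.append(beta.copy())

    print(np.mean(draws[1000:], axis=0))          # posterior means, all >= 0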
Geweke, J. 1992, 'Evaluating the accuracy of sampling-based approaches to the calculation of posterior moments' in Bernardo, J.M., Berger, J.O., Dawid, A.P. & Smith, A.F.M. (eds), Bayesian Statistics 4. Oxford University Press, Oxford, pp. 169-194. Abstract: Data augmentation and Gibbs sampling are two closely related sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and of the numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper, methods from spectral analysis are used to evaluate numerical accuracy formally and to construct a diagnostic for convergence. These methods are illustrated in the normal linear model with informative priors and in the Tobit censored regression model.
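The convergence diagnostic proposed in that paper compares the mean of an early segment of the simulator's output with the mean of a late segment. A simplified Python sketch; the paper estimates the variances with spectral density estimates at frequency zero to account for autocorrelation, whereas this version substitutes plain sample variances, which is adequate only for weakly autocorrelated output:

    import numpy as np

    def geweke_z(chain, first=0.1, last=0.5):
        # Z-score comparing the mean of the first 10% of the chain
        # with the mean of the last 50%.
        a = chain[: int(first * len(chain))]
        b = chain[-int(last * len(chain)):]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        return (a.mean() - b.mean()) / se

    rng = np.random.default_rng(0)
    chain = rng.standard_normal(10000)     # a well-mixed, stationary "chain"
    print(geweke_z(chain))                 # |z| > 2 would suggest non-convergence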
A further chapter appears in Equilibrium Theory and Applications, Cambridge University Press, Cambridge, pp. 425-480.

Barnett, W., Geweke, J. & Yue, P. 1991, 'Semiparametric Bayesian estimation of the asymptotically ideal model: The AIM demand system' in Barnett, W.A., Powell, J. & Tauchen, G.E. (eds), Nonparametric and Semiparametric Methods in Econometrics and Statistics. Cambridge University Press, Cambridge, pp. 127-174.

Geweke, J., chapter in Geweke, J., Barnett, W. & Shell, K. (eds), Economic Complexity: Chaos, Sunspots, Bubbles and Nonlinearity. Cambridge University Press, Cambridge, pp. 337-360. Abstract: Polynomial expansions of the normal probability density function are proposed as a class of models for unobserved components. Operational procedures for Bayesian inference in these models are developed, as are methods for combining a sequence of such models and evaluating the hypotheses of normality and symmetry. The contributions of this chapter are illustrated with an application to daily exchange rate changes.

Geweke, J. 1988, 'Exact inference in models with autoregressive conditional heteroskedasticity' in Barnett, W.A., Berndt, E.R. & White, H. (eds), Dynamic Econometric Modeling. Cambridge University Press, Cambridge, pp. 73-103.

Geweke, J. 1977, The Macmillan Press, London.

Geweke, J. 1984, 'Inference and causality in economic time series models' in Griliches, Z. & Intriligator, M.D. (eds), Handbook of Econometrics, Volume 2. North-Holland, Amsterdam, pp. 1101-1144.

Geweke, J. 1983, 'Causality, exogeneity and inference' in Hildenbrand, W. (ed), Advances in Econometrics. Cambridge University Press, Cambridge, pp. 209-236.

Geweke, J. 1982, 'Feedback between monetary policy, labor market activity, and wage inflation in the United States, 1955-1978' in Workers, Jobs, and Inflation. Brookings Institution Press, Washington, pp. 159-198.

Geweke, J. & Weisbrod, B. 1980, 'Some economic consequences of technological advance in medical care: The case of a new drug' in Helms, R. (ed), Drugs and Health. American Enterprise Institute, Washington.

Geweke, J. & Dent, W. 1980, 'On specification in simultaneous equation models' in Kmenta, J. & Ramsey, J.B. (eds), Evaluation of Econometric Models. Elsevier Science & Technology Books, pp. 169-196.

Geweke, J. 1978, 'The temporal and sectoral aggregation of seasonally adjusted time series' in Zellner, A. (ed), Seasonal Analysis of Economic Time Series. Government Printing Office, Washington, US, pp. 411-432. Abstract: Procedures for the optimal seasonal adjustment of economic time series and their aggregation are derived, using a criterion appropriate to the uses of adjusted data in policy and journalistic contexts. It is shown that data should be adjusted jointly and then aggregated temporally or sectorally as desired, a procedure that preserves linear aggregation identities. Examination of actual economic time series suggests that optimal seasonal adjustment and aggregation of data would afford a substantial improvement in the quality of sectorally disaggregated adjusted data and considerably reduce the subsequent revision required in currently adjusted series.

Geweke, J. 1978, 'The Revision of Seasonally Adjusted Time Series' in 1978 Proceedings of the Business and Economic Statistics Section - American Statistical Association, pp. 320-325.

Geweke, J. 1977, 'The Dynamic Factor Analysis of Economic Time Series Models' in Aigner, D. & Goldberger, A. (eds), Latent Variables in Socioeconomic Models. North-Holland, Amsterdam, pp. 365-383.

Geweke, J. 1977, 'Wage and Price Dynamics in U.S. Manufacturing' in New Methods in Business Cycle Research. Federal Reserve Bank of Minneapolis, Minneapolis, MN, USA, pp. 111-158.

Conferences

Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S.J. 2011, 'Economic rationality, risk presentation, and retirement portfolio choice', Financial Management Association Annual Meeting, Denver, USA.

Geweke, J. 2010, 'Complete and Incomplete Bayesian Models for Financial Time Series', JSM Proceedings, Section on Bayesian Statistical Science. American Statistical Association, Vancouver, Canada. This paper introduces the idea of an incomplete Bayesian model, which is a (possibly incoherent) prior predictive distribution for sample moments. Conventional complete Bayesian models also provide prior distributions for sample moments, and consequently formal comparison of complete and incomplete models can be conducted by means of posterior odds ratios. This provides a logically consistent and workable Bayesian alternative to non-Bayesian significance tests and is an effective tool in the process of model development. These ideas are illustrated using three well-known alternative models for monthly S&P 500 index returns.

Geweke, J. 2000, 'Bayesian Communication: The BACC System', 2000 Proceedings of the Section on Bayesian Statistical Sciences - American Statistical Association, pp. 40-49.

Geweke, J., Keane, M. & Runkle, D. 1994, 'Recursively Simulating Multinomial Multiperiod Probit Probabilities', American Statistical Association 1994 Proceedings of the Business and Economic Statistics Section.

Geweke, J. & Terui, N. 1991, 'Threshold Autoregressive Models for Macroeconomic Time Series: A Bayesian Approach', American Statistical Association 1991 Proceedings of the Business and Economic Statistics Section. American Statistical Association, pp. 42-50.

Geweke, J. 1991, 'Efficient Simulation from the Multivariate Normal and Student-t Distributions Subject to Linear Constraints', Computing Science and Statistics: Proceedings of the Twenty-Third Symposium on the Interface. Interface Foundation of North America, Fairfax, pp. 571-578.
The following routines constitute the software for the paper, "Efficient Simulation from the Multivariate Normal and Student-t Distributions Subject to Linear Constraints and the Evaluation of Constraint Probabilities," by John Geweke. This paper is to appear in the volume Computing Science and Statistics: Proceedings of the Twenty-Third Symposium on the Interface. This work was supported by NSF Grant SES-8908365.

Geweke, J. 1989, 'The Posterior Distribution of Roots in Multivariate Autoregressions', Erasmus University of Rotterdam - Econometric Institute.

Geweke, J. 1989, 'Acceleration Methods for Monte Carlo Integration in Bayesian Inference', American Statistical Society, Alexandria, pp. 587-592. Abstract: Methods for the acceleration of Monte Carlo integration with n replications in a sample of size T are investigated. A general procedure for combining antithetic variation and grid methods with Monte Carlo methods is proposed, and it is shown that the numerical accuracy of these hybrid methods can be evaluated routinely. The derivation indicates the characteristics of applications in which acceleration is likely to be most beneficial. This is confirmed in a worked example, in which these acceleration methods reduce the computation time required to achieve a given degree of numerical accuracy by several orders of magnitude.
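Antithetic variation, the core of the acceleration paper just above, is easy to demonstrate. A minimal Python sketch estimating E[exp(Z)] for Z ~ N(0, 1), whose exact value is exp(0.5); the example and sample size are mine, not the paper's worked example:

    import numpy as np

    rng = np.random.default_rng(0)

    n = 50000
    z = rng.standard_normal(n)

    plain = np.exp(z).mean()

    # Antithetic variates: pair each draw z with -z and average the pair.
    # Because exp() is monotone, the pair members are negatively correlated,
    # which reduces the variance of the combined estimator.
    antithetic = 0.5 * (np.exp(z) + np.exp(-z)).mean()

    print(plain, antithetic, np.exp(0.5))
    # Standard errors of the two estimators:
    print(np.exp(z).std() / np.sqrt(n),
          (0.5 * (np.exp(z) + np.exp(-z))).std() / np.sqrt(n))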
Geweke, J. 1986, 'Fixed Investment in the American Business Cycle', The American Business Cycle: Continuity and Change. National Bureau of Economic Research, New York.

Geweke, J. 1983, 'Semi-Nonparametric and Nonparametric Regression: Consumer Demand Applications', Proceedings of the Business and Economic Section - American Statistical Association. American Statistical Association, USA.

Geweke, J. 1983, 'Models of X-11 and X-11 Forecast Procedures', Applied Time Series Analysis of Economic Data. U.S. Bureau of the Census, Washington, pp. 12-13.

Geweke, J. 1982, 'New Divisia Indices of the Money Supply', Proceedings of the Business and Economics Section. American Statistical Association.

Geweke, J. 1978, 'On the Synthesis of Time Series and Econometric Models', Directions in Time Series. Institute of Mathematical Statistics.

Geweke, J. 1978, 'Some Recent Developments in Seasonal Adjustment', Directions in Time Series. Institute of Mathematical Statistics.

Journal articles

Bateman, H., Eckert, C., Geweke, J., Louviere, J., Satchell, S. & Thorp, S. 2016, 'Risk Presentation and Portfolio Choice', Review of Finance, vol. 20, no. 1, pp. 201-229. Efficient investment of personal savings depends on clear risk disclosures. We study the propensity of individuals to violate some implications of expected utility under alternative 'mass-market' descriptions of investment risk, using a discrete choice experiment. We found violations in around 25% of choices, and substantial variation in rates of violation, depending on the mode of risk disclosure and participants' characteristics. When risk is described as the frequency of returns below or above a threshold we observe more violations than for range and probability-based descriptions. Innumerate individuals are more likely to violate expected utility than those with high numeracy. Apart from the very elderly, older individuals are less likely to violate the restrictions. The results highlight the challenges of disclosure regulation.

Geweke, J. 2016, 'Comment on: Reflections on the probability space induced by moment conditions with implications for Bayesian inference', Journal of Financial Econometrics, vol. 14, no. 2, pp. 253-257.

Geweke, J. 2016, 'Sequentially Adaptive Bayesian Learning for a Nonlinear Model of the Secular and Cyclical Behavior of US Real GDP', Econometrics, vol. 4, no. 1, p. 10. There is a one-to-one mapping between the conventional time series parameters of a third-order autoregression and the more interpretable parameters of secular half-life, cyclical half-life, and cycle period. The latter parameterization is better suited to interpretation of results using both Bayesian and maximum likelihood methods, and to expression of a substantive prior distribution using Bayesian methods. The paper demonstrates how to approach both problems using the sequentially adaptive Bayesian learning algorithm and its software implementation (SABL), which eliminates virtually all of the substantial technical overhead required in conventional approaches and produces results quickly and reliably. The work utilizes methodological innovations in SABL including optimization of irregular and multimodal functions and production of the conventional maximum likelihood asymptotic variance matrix as a by-product.
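The one-to-one mapping mentioned in the SABL abstract can be sketched directly. Assuming the AR(3) lag polynomial factors as (1 - rL)(1 - 2m cos(theta) L + m^2 L^2), with the real root carrying the secular component and the complex pair carrying the cycle, half-lives follow from root moduli via r = 0.5^(1/half-life). The function below maps the interpretable parameters to the autoregressive coefficients; the names and conventions are mine, not the paper's code:

    import math

    def ar3_from_interpretable(secular_half_life, cyclical_half_life, period):
        # Real root modulus from the secular half-life, complex pair
        # modulus from the cyclical half-life, angle from the period.
        r = 0.5 ** (1.0 / secular_half_life)
        m = 0.5 ** (1.0 / cyclical_half_life)
        theta = 2.0 * math.pi / period
        # Expand (1 - rL)(1 - 2m cos(theta) L + m^2 L^2) = 1 - a1 L - a2 L^2 - a3 L^3
        a1 = r + 2.0 * m * math.cos(theta)
        a2 = -(m * m + 2.0 * r * m * math.cos(theta))
        a3 = r * m * m
        return a1, a2, a3

    # e.g. slow trend decay, faster cycle decay, 20-period cycle:
    print(ar3_from_interpretable(40.0, 8.0, 20.0))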
Durham, G., Geweke, J. & Ghosh, P. 2015, 'A comment on Christoffersen, Jacobs, and Ornthanalai (2012), Dynamic jump intensities and risk premiums: Evidence from S&P 500 returns and options', Journal of Financial Economics, vol. 115, no. 1, pp. 210-214.

Geweke, J. & Petrella, L. 2014, 'Likelihood-based Inference for Regular Functions with Fractional Polynomial Approximations', Journal of Econometrics, vol. 183, no. 1, pp. 22-30. The paper demonstrates limitations in previous work using Müntz-Szász polynomial approximations for regular functions. It introduces an alternative set of fractional polynomial approximations not subject to these limitations. Using Weierstrass approximation theory, it shows that the set of fractional polynomial approximations is dense on a Sobolev space of functions on a compact set. Imposing regularity conditions directly on the fractional polynomials produces pseudo-true approximations that converge rapidly to production functions having no exact representation as fractional polynomials. A small Monte Carlo study recovers this convergence in finite samples, and the results are promising for future development of an adequate sampling-theoretic distribution theory.

Frischknecht, B.D., Eckert, C., Geweke, J. & Louviere, J.J. 2014, 'A Simple Method to Estimate Preference Parameters for Individuals', International Journal of Research in Marketing, vol. 31, pp. 35-48.

Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S.J. 2014, 'Financial competence, risk presentation and retirement portfolio preferences', Journal of Pension Economics and Finance, vol. 13, no. 1, pp. 27-61.

Geweke, J. & Amisano, G. 2014, 'Analysis of variance for Bayesian inference', Econometric Reviews, vol. 33, no. 1-4, pp. 270-288.

Durham, G. & Geweke, J. 2014, 'Adaptive sequential posterior simulators for massively parallel computing environments', Advances in Econometrics, vol. 34, pp. 1-44. Copyright © 2014 by Emerald Group Publishing Limited. Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
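A stripped-down Python sketch of the sequential posterior simulation idea in the abstract above, under toy assumptions: a normal mean with known unit variance, a N(0, 100) prior, data introduced in ten arbitrary batches, and multinomial resampling when the effective sample size drops. The mutation (Metropolis-Hastings) phase of the published algorithm is omitted:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: 200 observations from N(theta = 1.5, sigma = 1)
    data = rng.normal(1.5, 1.0, size=200)

    # Particles drawn from the prior theta ~ N(0, 10^2)
    n_particles = 5000
    theta = rng.normal(0.0, 10.0, size=n_particles)
    log_w = np.zeros(n_particles)

    def loglik(theta, batch):
        # Gaussian log likelihood with known unit variance (constants dropped)
        return -0.5 * ((batch[:, None] - theta[None, :]) ** 2).sum(axis=0)

    # Introduce the data in batches, reweighting and resampling as we go
    for batch in np.array_split(data, 10):
        log_w += loglik(theta, batch)
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        ess = 1.0 / (w ** 2).sum()          # effective sample size
        if ess < n_particles / 2:           # resample when weights degenerate
            idx = rng.choice(n_particles, size=n_particles, p=w)
            theta, log_w = theta[idx], np.zeros(n_particles)
            # (the published algorithm would now mutate the particles
            #  with Metropolis-Hastings steps; omitted for brevity)

    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    print("posterior mean ~", float(w @ theta))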
Durham, G. & Geweke, J. 2014, 'Improving Asset Price Prediction When All Models are False', Journal of Financial Econometrics, vol. 12, no. 2, pp. 278-306.

Geweke, J. 2014, 'Review Essay on Charles F. Manski's Public Policy in an Uncertain World: Analysis and Decisions', Journal of Economic Literature, vol. 52, no. 3, pp. 799-804.

Bateman, H., Eckert, C., Geweke, J., Iskhakov, F., Louviere, J.J., Satchell, S.E. & Thorp, S. 2013, 'Disengagement: A Partial Solution to the Annuity Puzzle', UNSW Australian School of Business Research Paper, no. 2013. This research studies whether individuals make choices consistent with expected utility maximization in allocating wealth between a lifetime annuity and a phased withdrawal account at retirement. The paper describes the construction and administration of a discrete choice experiment to 854 respondents approaching retirement. The experiment finds overall rates of inconsistency with the predictions of the standard CRRA utility model of roughly 50%, and variation in consistency rates depending on the characteristics of the respondents. Individuals with poor numeracy and with low engagement with the choice task, as measured by scores on a task-specific recall quiz, are more likely to increase allocations to the phased withdrawal as the risk of exhausting it increases. Individuals with higher scores on tests of financial capability and with knowledge of retirement income products are more likely to score high on the engagement measure, but capability and knowledge do not have independent effects on consistent choice rates. Results suggest that initiatives to improve specific product knowledge and to help individuals engage with decumulation decisions could be a partial solution to the annuity puzzle.

Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Thorp, S.J. & Satchell, S. 2012, 'Financial competence and expectations formation: Evidence from Australia', The Economic Record, vol. 88, no. 280, pp. 39-63. We study the financial competence of Australian retirement savers using self-assessed and quantified measures. Responses to financial literacy questions show large variation and compare poorly with some international surveys. Basic and sophisticated financial literacy vary significantly with most demographics, self-assessed financial competence, income, superannuation accumulation, and net worth. General numeracy scores are largely constant across gender, age, higher education, and income. Financial competence also significantly affects expectations of stock market performance. Using a discrete choice model, we show that individuals with a higher understanding of risk, diversification, and financial assets are more likely to assign a probability to future financial crises rather than expressing uncertainty.

Geweke, J., Koop, G. & Paap, R. 2012, 'Introduction for the annals issue of the Journal of Econometrics on Bayesian Models, Methods and Applications', Journal of Econometrics, vol. 171, no. 2, pp. 99-100. This Annals issue of the Journal of Econometrics grew out of the European Seminar on Bayesian Econometrics (ESOBE), which was held at Erasmus University, Rotterdam on November 5-6, 2010. This conference was important for two reasons. First, it inaugurated ESOBE, which has become a successful annual conference that brings European and international Bayesians together. Second, it celebrated the retirement of Herman van Dijk after a long and successful career in Bayesian econometrics.

Geweke, J. & Amisano, G. 2012, 'Prediction With Misspecified Models', American Economic Review, vol. 102, no. 3, pp. 482-486. Many decision-makers in the public and private sectors routinely consult the implications of formal economic and statistical models in their work. Especially in large organizations and for important decisions, there are often competing models. Of course, no model under consideration is a literal representation of reality for the purposes at hand; more succinctly, no model is true, and different models focus on different aspects of the relevant environment. This fact can often be supported by formal econometric tests concluding that the models at hand are, indeed, misspecified in various dimensions.

Geweke, J. 2012, 'Nonparametric Bayesian modelling of monotone preferences for discrete choice experiments', Journal of Econometrics, vol. 171, no. 2, pp. 185-204. Discrete choice experiments are widely used to learn about the distribution of individual preferences for product attributes. Such experiments are often designed and conducted deliberately for the purpose of designing new products. There is a long-standing literature on nonparametric and Bayesian modelling of preferences for the study of consumer choice when there is a market for each product, but this work does not apply when such markets fail to exist, as is the case with most product attributes.
This paper takes up the common case in which attributes can be quantified and preferences over these attributes are monotone. It shows that monotonicity is the only shape constraint appropriate for a utility function in these circumstances. The paper models components of utility using a Dirichlet prior distribution and demonstrates that all monotone nondecreasing utility functions are supported by the prior. It develops a Markov chain Monte Carlo algorithm for posterior simulation that is reliable and practical given the number of attributes, choices and sample sizes characteristic of discrete choice experiments. The paper uses the algorithm to demonstrate the flexibility of the model in capturing heterogeneous preferences and applies it to a discrete choice experiment that elicits preferences for different auto insurance policies.

Geweke, J. & Amisano, G. 2011, 'Hierarchical Markov normal mixture models with applications to financial asset returns', Journal of Applied Econometrics, vol. 26, no. 1, pp. 1-29. Abstract: Motivated by the common problem of constructing predictive distributions for daily asset returns over horizons of one to several trading days, this article introduces a new model for time series. This model is a generalization of the Markov normal mixture model in which the mixture components are themselves normal mixtures, and it is a specific case of an artificial neural network model with two hidden layers. The article characterizes the implications of the model for time series in two ways. First, it derives the restrictions placed on the autocovariance function and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. Second, it uses the prior predictive distribution of the model to study the implications of the model for some interesting functions of asset returns. The article uses the model to construct predictive distributions of daily S&P 500 returns 1971-2005, US dollar/UK pound returns 1972-1998, and one- and ten-year maturity bond returns 1987-2006. It compares the performance of the model for these returns with ARCH and stochastic volatility models using the predictive likelihood function. The model's performance is about the same as its competitors for the bond returns, better than its competitors for the S&P 500 returns, and much better than its competitors for the dollar-pound returns. In- and out-of-sample validation exercises with predictive distributions identify some remaining deficiencies in the model and suggest potential improvements. The article concludes by using the model to form predictive distributions of one- to ten-day returns during volatile episodes for the S&P 500, dollar-pound, and bond return series.

Geweke, J. & Jiang, Y. 2011, 'Inference and prediction in a multiple-structural-break model', Journal of Econometrics, vol. 163, no. 2, pp. 172-185. This paper develops a new Bayesian approach to structural break modeling. The focuses of the approach are the modeling of in-sample structural breaks and forecasting time series allowing out-of-sample breaks. The model has several desirable features.

Geweke, J. & Amisano, G. 2011, 'Optimal prediction pools', Journal of Econometrics, vol. 164, no. 1, pp. 130-141.
ViewDownload from: UTS OPUS or Publishers site We consider the properties of weighted linear combinations of prediction models, or linear pools, evaluated using the log predictive scoring rule. Although exactly one model has limiting posterior probability, an optimal linear combination typically includes several models with positive weights. We derive several interesting results: for example, a model with positive weight in a pool may have zero weight if some other models are deleted from that pool. The results are illustrated using SampP 500 returns with six prediction models. In this example models that are clearly inferior by the usual scoring criteria have positive weights in optimal linear pools. Geweke, J. amp Amisano, G. 2011, Optimal prediction pools, vol. 164, no. 1, pp. 130-141. We consider the properties of weighted linear combinations of prediction models, or linear pools, evaluated using the log predictive scoring rule. Although exactly one model has limiting posterior probability, an optimal linear combination typically includes several models with positive weights. We derive several interesting results: for example, a model with positive weight in a pool may have zero weight if some other models are deleted from that pool. The results are illustrated using SampP 500 returns with six prediction models. In this example models that are clearly inferior by the usual scoring criteria have positive weights in optimal linear pools. Bateman, H. Eckert, C.. Geweke, J.. Louviere, J. J. Satchell, S. E. amp Thorp, S. 2011, Investment Risk Framing and Individual Preference Consistency, UNSW Australian School of Business Research Paper . Nein. 2010. Here we test the usefulness of a discrete choice experiment (DCE) for identifying individuals who consistently exhibit concave utility over returns to wealth, despite variations in the framing of risk. At the same time, we test the relative strengths of nine standard descriptions of investment risk. We ask a sample of 1200 retirement savings account holders to select their most and least preferred investment strategies from a menu of a safe (zero risk) savings account, a risky growth asset portfolio and a 50:50 share of both. We identify respondents who fail to conform with expected utility and test whether this behavior is predictable across different risk frames. Tests confirm that the DCE can help isolate individuals whose preferences violate global risk aversion despite variation in risk presentation. We also identify frames linked to significantly more consistent behavior by respondents. These are frames which simultaneously specify upside and downside risk. Frames that present risk as a frequency of failures or successes against a zero returns benchmark are more likely to generate violations of risk aversion. Bateman, H. Eckert, C.. Geweke, J.. Louviere, J. J. Satchell, S. E. amp Thorp, S. 2011, Financial Competence and Expectations Formation: Evidence from Australia, UNSW Australian School of Business Research Paper . Nein. 2011. We study the financial competence of Australian retirement savers using self-assessed and quantified measures. Responses to numeracy and basic and sophisticated financial literacy questions show large variation and compare poorly with international surveys. We graph the relationships between financial competence index scores and a wide range of demographics, economic outcomes and attitudes. 
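As a concrete illustration of the pooling idea in the entry above: given one-step-ahead predictive densities evaluated at the realized observations, the optimal linear pool maximizes the average log score over the pool weights. The sketch below is a minimal two-model version with simulated densities; the models and data are hypothetical, not those of the paper.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# p1[t], p2[t]: predictive densities of two models evaluated at the
# realized observation y[t]. Simulated here for illustration only.
rng = np.random.default_rng(0)
y = rng.standard_t(df=5, size=1000)                            # hypothetical returns
p1 = np.exp(-0.5 * y**2) / np.sqrt(2 * np.pi)                  # N(0, 1) model
p2 = np.exp(-0.5 * (y / 1.5)**2) / (1.5 * np.sqrt(2 * np.pi))  # N(0, 1.5^2) model

def neg_log_score(w):
    """Negative average log predictive score of the pool w*p1 + (1-w)*p2."""
    return -np.mean(np.log(w * p1 + (1 - w) * p2))

res = minimize_scalar(neg_log_score, bounds=(0.0, 1.0), method="bounded")
print(f"optimal weight on model 1: {res.x:.3f}")
print(f"log score: pool {-res.fun:.4f}, "
      f"model 1 {np.mean(np.log(p1)):.4f}, model 2 {np.mean(np.log(p2)):.4f}")
```

Even in this toy setting the paper's central point tends to survive: when both candidate models are misspecified, the optimal weight is often interior rather than concentrated on the model with the better individual log score.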
Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S. 2011, Investment Risk Framing and Individual Preference Consistency, UNSW Australian School of Business Research Paper, no. 2010. Here we test the usefulness of a discrete choice experiment (DCE) for identifying individuals who consistently exhibit concave utility over returns to wealth, despite variations in the framing of risk. At the same time, we test the relative strengths of nine standard descriptions of investment risk. We ask a sample of 1200 retirement savings account holders to select their most and least preferred investment strategies from a menu of a safe (zero risk) savings account, a risky growth asset portfolio and a 50:50 share of both. We identify respondents who fail to conform with expected utility and test whether this behavior is predictable across different risk frames. Tests confirm that the DCE can help isolate individuals whose preferences violate global risk aversion despite variation in risk presentation. We also identify frames linked to significantly more consistent behavior by respondents. These are frames which simultaneously specify upside and downside risk. Frames that present risk as a frequency of failures or successes against a zero returns benchmark are more likely to generate violations of risk aversion.

Bateman, H., Eckert, C., Geweke, J., Louviere, J.J., Satchell, S.E. & Thorp, S. 2011, Financial Competence and Expectations Formation: Evidence from Australia, UNSW Australian School of Business Research Paper, no. 2011. We study the financial competence of Australian retirement savers using self-assessed and quantified measures. Responses to numeracy and basic and sophisticated financial literacy questions show large variation and compare poorly with international surveys. We graph the relationships between financial competence index scores and a wide range of demographics, economic outcomes and attitudes. Results show significant variation of basic and sophisticated financial literacy measures with most demographics, self-assessed financial competence, income, superannuation accumulation and net worth. General numeracy scores show patterns of variation different to basic and sophisticated financial literacy, being largely constant by gender, age, higher education and personal income. Our results confirm the usefulness of studying numeracy and financial knowledge-based skills separately. We also consider the impact of financial competence on expectations of stock market performance. Consumers with higher understanding of risk, diversification and financial assets are more likely to assign a probability to future financial crises, and a time-frame to share market recovery, rather than expressing uncertainty.

Geweke, J. & Amisano, G. 2010, Comparing and evaluating Bayesian predictive distributions of asset returns, International Journal of Forecasting, vol. 26, no. 2, pp. 216-230. View/Download from: UTS OPUS or Publisher's site. Bayesian inference in a time series model provides exact out-of-sample predictive distributions that fully and coherently incorporate parameter uncertainty. This study compares and evaluates Bayesian predictive distributions from alternative models, using as an illustration five alternative models of asset returns applied to daily S&P 500 returns from the period 1976 through 2005. The comparison exercise uses predictive likelihoods and is inherently Bayesian. The evaluation exercise uses the probability integral transformation and is inherently frequentist. The illustration shows that the two approaches can be complementary, with each identifying strengths and weaknesses in models that are not evident using the other.

Geweke, J. 2010, Comment, International Journal of Forecasting, vol. 26, no. 2, pp. 435-438. View/Download from: UTS OPUS or Publisher's site. The article by Zellner and Ando proposes methods for coping with the excess kurtosis that is often observed in disturbances in applications of the seemingly unrelated regressions (SUR) model. This is an important topic.

Ackerberg, D., Geweke, J. & Hahn, J. 2009, Comments on 'Convergence Properties of the Likelihood of Computed Dynamic Models', Econometrica, vol. 77, no. 6, pp. 2009-2017. View/Download from: UTS OPUS or Publisher's site. We show by counterexample that Proposition 2 in Fernández-Villaverde, Rubio-Ramírez, and Santos (Econometrica (2006), 74, 93-119) is false. We also show that even if their Proposition 2 were corrected, it would be irrelevant for parameter estimates. As a more constructive contribution, we consider the effects of approximation error on parameter estimation, and conclude that second order approximation errors in the policy function have at most second order effects on parameter estimates.

Geweke, J. & Keane, M. 2007, Smoothly mixing regressions, Journal of Econometrics, vol. 138, no. 1, pp. 252-290. View/Download from: UTS OPUS or Publisher's site. This paper extends the conventional Bayesian mixture of normals model by permitting state probabilities to depend on observed covariates. The dependence is captured by a simple multinomial probit model. A conventional and rapidly mixing MCMC algorithm provides access to the posterior distribution at modest computational cost. This model is competitive with existing econometric models, as documented in the paper's illustrations. The first illustration studies quantiles of the distribution of earnings of men conditional on age and education, and shows that smoothly mixing regressions are an attractive alternative to non-Bayesian quantile regression. The second illustration models serial dependence in the S&P 500 return, and shows that the model compares favorably with ARCH models using out-of-sample likelihood criteria.

Geweke, J. 2007, Interpretation and Inference in Mixture Models: Simple MCMC Works, Computational Statistics and Data Analysis, vol. 51, no. 7, pp. 3529-3550. View/Download from: UTS OPUS or Publisher's site. Abstract: The mixture model likelihood function is invariant with respect to permutation of the components of the mixture. If functions of interest are permutation sensitive, as in classification applications, then interpretation of the likelihood function requires valid inequality constraints, and a very large sample may be required to resolve ambiguities. If functions of interest are permutation invariant, as in prediction applications, then there are no such problems of interpretation. Contrary to assessments in some recent publications, simple and widely used Markov chain Monte Carlo (MCMC) algorithms with data augmentation reliably recover the entire posterior distribution.

Geweke, J. 2007, Bayesian model comparison and validation, American Economic Review, vol. 97, no. 2, pp. 60-64. View/Download from: UTS OPUS or Publisher's site. Bayesian econometrics provides a tidy theory and practical methods of comparing and combining several alternative, completely specified models for a common data set. It is always possible that none of the specified models describes important aspects of the data well. The investigation of this possibility, a process known as model validation or model specification checking, is an important part of applied econometric work. Bayesian theory and practice for model validation are less well developed. A well-established Bayesian literature argues that non-Bayesian methods are essential in model validation. This line of thought persists in Bayesian econometrics as well; the paper reviews these methods. The paper proposes an alternative, fully Bayesian method of model validation based on the concept of incomplete models, and argues that this method is also strategically advantageous in applied Bayesian econometrics.

Geweke, J., Groenen, P.J.E., Paap, R. & van Dijk, H.K. 2007, Computational techniques for applied econometric analysis of macroeconomic and financial processes, Computational Statistics & Data Analysis, vol. 51, no. 7, pp. 3506-3508. View/Download from: Publisher's site.

Geweke, J. 2007, Comment on 'Bayesian Analysis of DSGE Models', Econometric Reviews, vol. 26, no. 2-4, pp. 193-200. View/Download from: Publisher's site. The article provides detailed and accurate illustrations of Bayesian analysis of DSGE models that are likely to be used increasingly in support of central bank policy making. These comments identify a dozen aspects of these methods, discussing how their application and improvement can contribute to effective support of policy.

Abrantes-Metz, R., Froeb, L., Geweke, J. & Taylor, C. 2006, A Variance Screen for Collusion, International Journal of Industrial Organization, vol. 24, no. 3, pp. 467-486. View/Download from: UTS OPUS or Publisher's site. Abstract: In this paper, we examine price movements over time around the collapse of a bid-rigging conspiracy. While the mean decreased by sixteen percent, the standard deviation increased by over two hundred percent. We hypothesize that conspiracies in other industries would exhibit similar characteristics and search for "pockets" of low price variation as indicators of collusion in the retail gasoline industry in Louisville. We observe no such areas around Louisville in 1996-2002.

Geweke, J. 2004, Getting it Right: Joint Distribution Tests of Posterior Simulators, Journal of the American Statistical Association, vol. 99, no. 467, pp. 799-804. View/Download from: UTS OPUS or Publisher's site. Abstract: Analytical or coding errors in posterior simulators can produce reasonable but incorrect approximations of posterior moments. This article develops simple tests of posterior simulators that detect both kinds of errors, and uses them to detect and correct errors in two previously published papers. The tests exploit the fact that a Bayesian model specifies the joint distribution of observables (data) and unobservables (parameters). There are two joint distribution simulators. The marginal-conditional simulator draws unobservables from the prior and then observables conditional on unobservables. The successive-conditional simulator alternates between the posterior simulator and an observables simulator. Formal comparison of moment approximations of the two simulators reveals existing analytical or coding errors in the posterior simulator.
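The joint-distribution test described in the entry above is easy to sketch for a toy conjugate model. Below, a minimal version for y | theta ~ N(theta, 1) with theta ~ N(0, 1): the marginal-conditional simulator draws (theta, y) directly, the successive-conditional simulator alternates the (here analytically exact) posterior sampler with fresh data draws, and a z-statistic compares moment estimates. The test function g and all names are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
M = 200_000

def posterior_draw(y):
    """Exact posterior sampler for theta | y in the toy model.
    In practice this is the (possibly buggy) simulator under test."""
    return rng.normal(y / 2, np.sqrt(1 / 2))

# Marginal-conditional simulator: theta ~ prior, then y | theta.
theta_mc = rng.normal(0, 1, size=M)
y_mc = rng.normal(theta_mc, 1)

# Successive-conditional simulator: alternate posterior and data draws.
theta_sc, y_sc = np.empty(M), np.empty(M)
theta, y = 0.0, 0.0
for m in range(M):
    theta = posterior_draw(y)   # posterior simulator step
    y = rng.normal(theta, 1)    # observables simulator step
    theta_sc[m], y_sc[m] = theta, y

# Compare moments of a test function g(theta, y) from the two simulators.
g_mc, g_sc = theta_mc * y_mc, theta_sc * y_sc
se = np.sqrt(g_mc.var() / M + g_sc.var() / M)   # naive; ignores serial dependence
z = (g_mc.mean() - g_sc.mean()) / se            # in the successive-conditional chain
print(f"z-statistic: {z:.2f}  (large |z| signals an error in the simulator)")
```

Geweke's article computes the standard error with an autocorrelation-consistent estimator for the successive-conditional chain; the naive standard error above would overstate significance in a serious application.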
Geweke, J. & Tanizaki, H. 2003, Note on the Sampling Distribution for the Metropolis-Hastings Algorithm, Communications in Statistics - Theory and Methods, vol. 32, pp. 775-789. View/Download from: UTS OPUS or Publisher's site. Abstract: The Metropolis-Hastings algorithm has been important in the recent development of Bayes methods. This algorithm generates random draws from a target distribution utilizing a sampling (or proposal) distribution. This article compares the properties of three sampling distributions: the independence chain, the random walk chain, and the Taylored chain suggested by Geweke and Tanizaki (Geweke, J. & Tanizaki, H. (1999). On Markov chain Monte Carlo methods for nonlinear and non-Gaussian state-space models. Communications in Statistics, Simulation and Computation 28(4):867-894; Geweke, J. & Tanizaki, H. (2001). Bayesian estimation of state-space model using the Metropolis-Hastings algorithm within Gibbs sampling. Computational Statistics and Data Analysis 37(2):151-170).
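Two of the three proposal types compared in that note are simple to demonstrate. The sketch below runs Metropolis-Hastings on a standard normal target with an independence proposal and with a random walk proposal, and reports acceptance rates; the target and tuning constants are illustrative, not those of the article.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
log_target = lambda x: -0.5 * x**2      # standard normal target (unnormalized)

def mh(proposal, log_q_ratio, n=50_000):
    """Generic Metropolis-Hastings; proposal(x) draws a candidate c, and
    log_q_ratio(x, c) = log q(x | c) - log q(c | x)."""
    x, draws, accepted = 0.0, np.empty(n), 0
    for i in range(n):
        c = proposal(x)
        if np.log(rng.uniform()) < log_target(c) - log_target(x) + log_q_ratio(x, c):
            x, accepted = c, accepted + 1
        draws[i] = x
    return draws, accepted / n

# Independence chain: candidates from a fixed density, here N(0, 2^2).
ind = mh(lambda x: rng.normal(0, 2),
         lambda x, c: norm.logpdf(x, 0, 2) - norm.logpdf(c, 0, 2))

# Random walk chain: candidates centered at the current state (symmetric, ratio = 0).
rw = mh(lambda x: x + rng.normal(0, 1), lambda x, c: 0.0)

print(f"independence chain acceptance rate: {ind[1]:.2f}")
print(f"random walk chain acceptance rate:  {rw[1]:.2f}")
```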
Geweke, J., Gowrisankaran, G. & Town, R. 2003, Bayesian Inference for Hospital Quality in a Selection Model, Econometrica, vol. 71, pp. 1215-1238. View/Download from: UTS OPUS or Publisher's site. This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and nonrandom selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which distance between the patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, and attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of the highest quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals, whereby patients with a high unobserved severity of illness are disproportionately admitted to high quality hospitals. Consequently a conventional probit model leads to inferences about quality that are markedly different from those in this study's selection model.

Geweke, J. & Durham, G. 2003, Iterative and Recursive Estimation in Structured Non-Adaptive Models, Journal of Business & Economic Statistics, vol. 21, no. 4, pp. 490-492. This was an invited paper for which the journal solicited comments from various people; my comment starts on page 490, at the end of the article.

Geweke, J. & Martin, D. 2002, Pitfalls in Drawing Policy Conclusions from Retrospective Survey Data: The Case of Advertising and Underage Smoking, Journal of Risk and Uncertainty, vol. 83, pp. 1181-1186. Abstract: Measuring the impact of potentially controllable factors on the willingness of youth to undertake health risks is important to informed public health policy decisions. Typically the only data linking these factors with risk-taking behavior are retrospective. This study demonstrates, by means of a recent example, that there can be serious pitfalls in using even longitudinal retrospective data to draw conclusions about causal relations between potentially controllable factors and risk-taking behavior.

Geweke, J. 2002, Commentary: Econometric issues in using the AHEAD Panel, Journal of Econometrics, vol. 112, no. 1, pp. 115-120. View/Download from: UTS OPUS or Publisher's site. This study provides an illuminating perspective on the relation between health and socio-economic status. It is notable in meeting, head on, various technical but critical issues that arise in using the AHEAD panel to address issues of causation between health and socio-economic status (SES). This panel provides multiple measures of both health and SES, and there is no prior consensus reduction of these many dimensions. Household wealth is the candidate summary measure of economic status, but as users of self-reported wealth know and the authors lucidly demonstrate, severe measurement errors raise a host of methodological problems of their own. These comments focus on the way the authors have addressed these and some of the other technical issues that have to be confronted in one way or another in order to address the central issues.

Geweke, J. 2001, Bayesian Econometrics and Forecasting, Journal of Econometrics, vol. 100, no. 1, pp. 11-15. View/Download from: UTS OPUS or Publisher's site. Abstract: Contemporary Bayesian forecasting methods draw on foundations in subjective probability and preferences laid down in the mid-twentieth century, and utilize numerical methods developed since that time in their implementation. These methods unify the tasks of forecasting and model evaluation. They also provide tractable solutions for problems that prove difficult when approached using non-Bayesian methods. These advantages arise from the fact that the conditioning in Bayesian probability forecasting is the same as the conditioning in the underlying decision problems.

Geweke, J. 2001, Bayesian inference and posterior simulators, Canadian Journal of Agricultural Economics, vol. 49, no. 3, pp. 313-325. View/Download from: UTS OPUS or Publisher's site. Abstract: Recent advances in simulation methods have made possible the systematic application of Bayesian methods to support decision making with econometric models. This paper outlines the key elements of Bayesian investigation, and the simulation methods applied to bring them to bear in application.

Geweke, J. & McCausland, W.J. 2001, Bayesian Specification Analysis in Econometrics, American Journal of Agricultural Economics, vol. 83, pp. 1181-1186. View/Download from: Publisher's site.

Geweke, J. & Tanizaki, H. 2001, Bayesian Estimation of State-Space Models Using the Metropolis-Hastings Algorithm within Gibbs Sampling, Computational Statistics and Data Analysis, vol. 37, no. 2, pp. 151-170. View/Download from: UTS OPUS. Abstract: In this paper, an attempt is made to show a general solution to nonlinear and/or non-Gaussian state-space modeling in a Bayesian framework, which corresponds to an extension of Carlin et al. (J. Amer. Statist. Assoc. 87(418) (1992) 493-500) and Carter and Kohn (Biometrika 81(3) (1994) 541-553; Biometrika 83(3) (1996) 589-601). Using the Gibbs sampler and the Metropolis-Hastings algorithm, an asymptotically exact estimate of the smoothing mean is obtained.

Geweke, J. 2001, A Note on Some Limitations of CRRA Utility, Economics Letters, vol. 71, no. 3, pp. 341-345. View/Download from: UTS OPUS. Abstract: In a standard environment for choice under uncertainty with constant relative risk aversion (CRRA), the existence of expected utility is fragile with respect to changes in the distributions of random variables, changes in prior information, or the assumption of rational expectations.

Geweke, J. & Keane, M. 2000, An empirical analysis of earnings dynamics among men in the PSID: 1968-1989, Journal of Econometrics, vol. 96, no. 2, pp. 293-356. View/Download from: Publisher's site.

Geweke, J., Rust, J. & van Dijk, H.K. 2000, Introduction: Inference and decision making, Journal of Applied Econometrics, vol. 15, no. 6, pp. 545-546. View/Download from: Publisher's site.

Geweke, J. & Tanizaki, H. 1999, On Markov Chain Monte Carlo Methods for Nonlinear and Non-Gaussian State-Space Models, Communications in Statistics - Simulation and Computation, vol. 28, pp. 867-894. View/Download from: Publisher's site. Abstract: In this paper, a nonlinear and/or non-Gaussian smoother utilizing Markov chain Monte Carlo methods is proposed, where the measurement and transition equations are specified in any general formulation and the error terms in the state-space model are not necessarily normal.
The random draws are directly generated from the smoothing densities. For random number generation, the Metropolis-Hastings algorithm and the Gibbs sampling technique are utilized. The proposed procedure is very simple and easy to program, compared with the existing nonlinear and non-Gaussian smoothing techniques. Moreover, taking several candidates for the proposal density function, we examine the precision of the proposed estimator.

Geweke, J. 1999, Using Simulation Methods for Bayesian Econometric Models: Inference, Development and Communication, Econometric Reviews, vol. 18, no. 1, pp. 1-73. View/Download from: UTS OPUS. Abstract: This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.

Geweke, J. 1999, Power of Tests in Binary Response Models: Comment, Econometrica, vol. 67, pp. 423-425. View/Download from: Publisher's site.

Geweke, J. 1998, Real and Spurious Long Memory Properties of Stock Market Data, Journal of Business & Economic Statistics, vol. 16, pp. 269-271.

Geweke, J. & Petrella, L. 1998, Prior Density Ratio Class Robustness in Econometrics, Journal of Business & Economic Statistics, vol. 16, pp. 469-478. View/Download from: Publisher's site. Abstract: This paper provides a general and efficient method for computing density ratio class bounds on posterior moments, given the output of a posterior simulator. It shows how density ratio class bounds for posterior odds ratios may be formed in many situations, also on the basis of posterior simulator output. The computational method is used to provide density ratio class bounds in two economic models. It is found that the exact bounds are approximated poorly by their asymptotic approximation when the posterior distribution of the function of interest is skewed. It is also found that the posterior odds ratios display substantial variation within the density ratio class, in ways that cannot be anticipated by the asymptotic approximation.

Geweke, J., Keane, M. & Runkle, D. 1997, Statistical Inference in the Multinomial Multiperiod Probit Model, Journal of Econometrics, vol. 80, no. 1, pp. 125-165. View/Download from: Publisher's site. Abstract: Statistical inference in multinomial multiperiod probit models has been hindered in the past by the high dimensional numerical integrations necessary to form the likelihood functions, posterior distributions, or moment conditions in these models. We describe three alternative estimators, implemented using simulation-based approaches to inference, that circumvent the integration problem: posterior means computed using Gibbs sampling and data augmentation (GIBBS), simulated maximum likelihood (SML) estimation using the GHK probability simulator, and method of simulated moments (MSM) estimation using GHK. We perform a set of Monte Carlo experiments to compare the sampling distributions of these estimators. Although all three estimators perform reasonably well, some important differences emerge. Our most important finding is that, holding simulation size fixed, the relative and absolute performance of the classical methods, especially SML, gets worse when serial correlation in disturbances is strong. In data sets with an AR(1) parameter of 0.50, the RMSEs for SML and MSM based on GHK with 20 draws exceed those of GIBBS by 9% and 0%, respectively. But when the AR(1) parameter is 0.80, the RMSEs for SML and MSM based on 20 draws exceed those of GIBBS by 79% and 37%, respectively, and the numbers of draws needed to reduce the RMSEs to within 10% of GIBBS are 160 and 80, respectively. Also, the SML estimates of serial correlation parameters exhibit significant downward bias. Thus, while conventional wisdom suggests that 20 draws of GHK is 'enough' to render the bias and noise induced by simulation negligible, our results suggest that much larger simulation sizes are needed when serial correlation in disturbances is strong.
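The GHK probability simulator referred to in the entry above estimates rectangle probabilities of a multivariate normal vector by recursive conditioning on truncated normal draws. A minimal sketch, assuming the event of interest is {x < 0} componentwise for x ~ N(mu, Sigma); the example values are invented for illustration.

```python
import numpy as np
from scipy.stats import norm

def ghk(mu, sigma, n_draws=10_000, seed=3):
    """GHK simulator for P(x_1 < 0, ..., x_k < 0), x ~ N(mu, Sigma)."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(sigma)       # x = mu + L e, with e ~ N(0, I)
    k = len(mu)
    weights = np.ones(n_draws)
    e = np.zeros((n_draws, k))
    for j in range(k):
        # Upper truncation point for e_j given e_1, ..., e_{j-1}.
        b = -(mu[j] + e[:, :j] @ L[j, :j]) / L[j, j]
        p = norm.cdf(b)
        weights *= p                    # probability mass of each slice
        u = rng.uniform(size=n_draws)
        e[:, j] = norm.ppf(u * p)       # truncated normal draw with e_j < b
    return weights.mean()

mu = np.array([0.5, -0.2, 0.1])
sigma = np.array([[1.0, 0.5, 0.2],
                  [0.5, 1.0, 0.3],
                  [0.2, 0.3, 1.0]])
print(f"GHK estimate of P(x < 0): {ghk(mu, sigma):.4f}")
```

The paper's Monte Carlo comparison is precisely about how many such draws are needed before SML and MSM estimates built on this simulator become competitive with the Gibbs approach.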
Geweke, J. & Zhou, G. 1996, Measuring the Pricing Error of the Arbitrage Pricing Theory, Review of Financial Studies, vol. 9, no. 2, pp. 557-587. View/Download from: UTS OPUS. Abstract: This article provides an exact Bayesian framework for analyzing the arbitrage pricing theory (APT). Based on the Gibbs sampler, we show how to obtain the exact posterior distributions for functions of interest in the factor model. In particular, we propose a measure of the APT pricing deviations and obtain its exact posterior distribution. Using monthly portfolio returns grouped by industry and market capitalization, we find that there is little improvement in reducing the pricing errors by including more factors beyond the first one.

Geweke, J. 1996, Bayesian Reduced Rank Regression in Econometrics, Journal of Econometrics, vol. 75, no. 1, pp. 121-146. View/Download from: Publisher's site. Abstract: The reduced rank regression model arises repeatedly in theoretical and applied econometrics. To date the only general treatments of this model have been frequentist. This paper develops general methods for Bayesian inference with noninformative reference priors in this model, based on a Markov chain sampling algorithm, and procedures for obtaining predictive odds ratios for regression models with different ranks. These methods are used to obtain evidence on the number of factors in a capital asset pricing model.

Geweke, J. & Runkle, D. 1995, A Fine Time for Monetary Policy?, Federal Reserve Bank of Minneapolis Quarterly Review, vol. 19, no. 1, pp. 18-31. View/Download from: UTS OPUS. Almost everyone would agree--even we in the Federal Reserve System--that monetary policy can be improved. But improving it requires accurate empirical descriptions of the current policy and the relationship between that policy and the economic variables policymakers care about. With those descriptions, we could, conceivably, predict how economic outcomes would change under alternative policies and hence find policies that lead to better economic outcomes. The first requirement of this policymaking problem is policy identification, and it is the focus of this study. Policy identification entails a specification of the instrument the Federal Reserve controls and a description of how that instrument is set based on information available when a policy decision is made. Because policy identification is a crucial step in the search for improved monetary policy, it has received much attention in the literature.

Geweke, J. 1994, Bayesian Analysis of Stochastic Volatility Models, Journal of Business & Economic Statistics, vol. 12, pp. 397-399.

Geweke, J., Keane, M. & Horowitz, J.L. 1994, Advances in Random Utility Models, Marketing Letters, vol. 5, pp. 311-322. Abstract: In recent years, major advances have taken place in three areas of random utility modeling: (1) semiparametric estimation, (2) computational methods for multinomial probit models, and (3) computational methods for Bayesian estimation. This paper summarizes these developments and discusses their implications for practice.

Geweke, J. 1994, Priors for Macroeconomic Time Series and Their Application, Econometric Theory, vol. 10, pp. 609-632. View/Download from: Publisher's site. Abstract: This paper takes up Bayesian inference in a general trend stationary model for macroeconomic time series with independent Student-t disturbances. The model is linear in the data, but non-linear in the parameters. An informative but nonconjugate family of prior distributions for the parameters is introduced, indexed by a single parameter which can be readily elicited. The main technical contribution is the construction of posterior moments, densities, and odds ratios using a six-step Gibbs sampler. Mappings from the index parameter of the family of prior distributions to posterior moments, densities, and odds ratios are developed for several of the Nelson-Plosser time series. These mappings show that the posterior distribution is not even approximately Gaussian, and indicate the sensitivity of the posterior odds ratio in favor of difference stationarity to the choice of prior distribution.

Geweke, J., Keane, M. & Runkle, D. 1994, Alternative Computational Approaches to Statistical Inference in the Multinomial Probit Model, Review of Economics and Statistics, vol. 76, no. 4, pp. 609-632. View/Download from: UTS OPUS or Publisher's site. This research compares several approaches to inference in the multinomial probit model, based on two Monte Carlo experiments for a seven choice model. The methods compared are the simulated maximum likelihood estimator using the GHK recursive probability simulator.

Geweke, J. & Terui, N. 1993, Bayesian Threshold Autoregressive Models for Nonlinear Time Series, Journal of Time Series Analysis, vol. 14, pp. 441-445. Abstract: This paper provides a Bayesian approach to statistical inference in the threshold autoregressive model for time series. The exact posterior distribution of the delay and threshold parameters is derived, as is the multi-step-ahead predictive density. The proposed methods are applied to the Wolf sunspot and Canadian lynx data sets.

Geweke, J. 1993, Forecasting Time Series with Common Seasonal Patterns, Journal of Econometrics, vol. 55, pp. 201-202.

Geweke, J. 1993, Discussion on the Gibbs Sampler and Other Markov Chain Monte Carlo Methods, Journal of the Royal Statistical Society, Series B (Methodological), vol. 55, p. 74.

Clifford, P., Jennison, C., Wakefield, J., Phillips, D., Frigessi, A., Gray, A., Lawson, A., Forster, J., Ramgopal, P., Arslan, O., Constable, P., Kent, J., Wolff, R., Harding, E., Middleton, R., Diggle, P., Aykroyd, R., Berzuini, C., Brewer, M. & Aitken, C. 1993, Discussion on the Meeting on the Gibbs Sampler and Other Markov Chain Monte Carlo Methods, Journal of the Royal Statistical Society, Series B (Methodological), vol. 55, no. 1, pp. 53-102.

Geweke, J. 1993, Bayesian Treatment of the Independent Student-t Linear Model, Journal of Applied Econometrics, vol. 8, pp. S19-S40. View/Download from: Publisher's site.

Geweke, J. 1993, Remarks on My Term at JBES, Journal of Business & Economic Statistics, vol. 11, no. 4, p. 427.

Geweke, J. 1992, Inference and Prediction in the Presence of Uncertainty and Determinism, Statistical Science, vol. 7, pp. 94-101.

Geweke, J. 1991, Generic, Algorithmic Approaches to Monte Carlo Integration in Bayesian Inference, Contemporary Mathematics, vol. 115, pp. 117-135. Abstract: A program of research in generic, algorithmic approaches to Monte Carlo integration in Bayesian inference is summarized. The goal of this program is the development of a widely applicable family of solutions to Bayesian multiple integration problems that obviates the need for case-by-case treatment of arcane problems in numerical analysis. The essentials of the Bayesian inference problem, with some reference to econometric applications, are set forth. Fundamental results in Monte Carlo integration are derived and their current implementation in software is described. Potential directions for fruitful new research are outlined.

Geweke, J., Barnett, W. & Wolfe, M. 1991, Seminonparametric Bayesian Estimation of the Asymptotically Ideal Production Model, Journal of Econometrics, vol. 49, pp. 5-50.
View/Download from: Publisher's site. Abstract: Recently it has been shown that seminonparametric methods can be used to produce high-quality approximations to a firm's technology. Unlike the local approximations provided by the conventional class of 'flexible functional forms', seminonparametric methods generate global spans within large classes of functions. However, that approach usually spans a much larger space than the neoclassical function space relevant to most production modeling. An exception is the asymptotically ideal model (AIM) generated from the Müntz-Szatz series expansion. Since every basis function in that expansion is within the neoclassical function space, a straightforward method exists for imposing neoclassical regularity when all factors are substitutes. Since the relevant constraints are inequality restrictions, we implement the approach using Bayesian methods to avoid the problems of sampling distribution truncation that would occur with sampling theoretic methods. We further discuss the relevant extensions that would permit complementary factors, nonconstant returns to scale, and technological change.

Geweke, J., Matchar, D., Simel, D. & Feussner, J. 1990, A Bayesian Method for Evaluating Medical Test Operating Characteristics When Some Patients Fail to be Diagnosed by the Reference Standard, Medical Decision Making, vol. 10, pp. 114-115.

Matchar, D., Simel, D., Geweke, J. & Feussner, J. 1990, A Bayesian Method for Evaluating Medical Test Operating Characteristics When Some Patients' Conditions Fail to be Diagnosed by the Reference Standard, Medical Decision Making, vol. 10, no. 2, pp. 102-115. View/Download from: Publisher's site. Abstract: The evaluation of a diagnostic test when the reference standard fails to establish a diagnosis in some patients is a common and difficult analytical problem. Conventional operating characteristics, derived from a 2 x 2 matrix, require that tests have only positive or negative results, and that disease status be designated definitively as present or absent. Results can be displayed in a 2 x 3 matrix, with an additional column for undiagnosed patients, when it is not always possible to ascertain the disease status definitively. The authors approach this problem using a Bayesian method for evaluating the 2 x 3 matrix in which test operating characteristics are described by a joint probability density function. They show that one can derive this joint probability density function of sensitivity and specificity empirically by applying a sampling algorithm. The three-dimensional histogram resulting from this sampling procedure approximates the true joint probability density function for sensitivity and specificity. Using a clinical example, the authors illustrate the method and demonstrate that the joint probability density function for sensitivity and specificity can be influenced by assumptions used to interpret test results in undiagnosed patients. This Bayesian method represents a flexible and practical solution to the problem of evaluating test sensitivity and specificity when the study group includes patients whose disease could not be diagnosed by the reference standard. Keywords: Bayesian analysis; test operating characteristics; probability density functions. (Med Decis Making 1990;10:102-111)

Geweke, J. 1989, Bayesian Inference in Econometric Models Using Monte Carlo Integration, Econometrica, vol. 57, no. 6, pp. 1317-1339. View/Download from: Publisher's site. Abstract: Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesian inference in econometric models are developed. Conditions under which the numerical approximation of a posterior moment converges almost surely to the true value as the number of Monte Carlo replications increases, and under which the numerical accuracy of this approximation may be assessed reliably, are set forth. Methods for the analytical verification of these conditions are discussed. Importance sampling densities are derived from multivariate normal or Student-t approximations to the local behavior of the posterior density at its mode. These densities are modified by automatic rescaling along each axis. The concept of relative numerical efficiency is introduced to evaluate the adequacy of a chosen importance sampling density. The practical procedures based on these innovations are illustrated in two different models.
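The 1989 Econometrica entry above is the canonical reference for importance-sampling approximation of posterior moments and for relative numerical efficiency (RNE). A minimal sketch with a Student-t importance density; the target posterior is a toy unnormalized density, and the RNE below uses the common effective-sample-size form rather than the article's exact estimator.

```python
import numpy as np
from scipy.stats import t as student_t

rng = np.random.default_rng(4)
log_post = lambda x: -0.5 * x**4           # toy unnormalized log posterior

# Importance density: Student t centered at the posterior mode (0 here).
n = 100_000
q = student_t(df=5, loc=0.0, scale=1.0)
draws = q.rvs(size=n, random_state=rng)
log_w = log_post(draws) - q.logpdf(draws)
w = np.exp(log_w - log_w.max())            # stabilize before exponentiating

post_mean = np.sum(w * draws) / np.sum(w)  # self-normalized IS estimate of E[x]
rne = np.sum(w) ** 2 / (n * np.sum(w**2))  # effective sample size / n
print(f"posterior mean estimate: {post_mean:.4f}")
print(f"relative numerical efficiency: {rne:.3f}")
```

An RNE near one says the importance density is nearly as efficient as i.i.d. sampling from the posterior; values far below one signal a poorly matched importance density, which is exactly the diagnostic use the paper proposes.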
ViewDownload from: Publishers site Abstract: Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesian inference in econometric models are developed. Conditions under which the numerical approximation of a posterior moment converges almost surely to the true value as the number of Monte Carlo replications increases, and the numerical accuracy of this approximation may be assessed reliably, are set forth. Methods for the analytical verification of these conditions are discussed. Importance sampling densities are derived from multivariate normal of Student t approximations to local behavior of the posterior density at its mode. These densities are modified by automatic rescaling along each axis. The concept of relative numerical efficiency is introduced to evaluate the adequacy of a chosen importance sampling density. The practical procedures based on these innovations are illustrated in two different models Geweke, J. 1989, Exact Predictive Densities in Linear Models with ARCH Disturbances, Journal of Econometrics . vol. 40, pp. 63-86. ViewDownload from: Publishers site Abstract: It is shown how exact predictive densities may be formed in the ARCH linear model by means of Monte Carlo integration with importance sampling. Several improvements in computational efficiency over earlier implementations of this procedure are developed, including use of the exact likelihood function rather than an asymptotic approximation to construct the importance sampling distribution, and antithetic acceleration of convergence. A numerical approach to the formulation of posterior odds ratios and the combination of non-nested models is also introduced. These methods are applied to daily quotations of closing stock prices. Forecasts are formulated using linear models, ARCH linear models and an integrated model constructed from the posterior probabilities of the respective models. The use of the exact predictive density in a decision-theoretic context is illustrated by deriving the optimal day-to-day portfolio adjustments of a trader with constant relative risk aversion. Geweke, J. 1989, Sensitivity Analysis of Seasonal Adjustments: Empirical Case Studies, Journal of the American Statistical Association . vol. 84, pp. 28-30. CARLIN, J. B. DEMPSTER, A. P. PIERCE, D. A. BELL, W. R. CLEVELAND, W. S. WATSON, M. W. amp GEWEKE, J. 1989, SENSITIVITY ANALYSIS OF SEASONAL ADJUSTMENTS - EMPIRICAL CASE STUDIES - COMMENTS, JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION . vol. 84, no. 405, pp. 6-30. ViewDownload from: Publishers site Geweke, J. 1988, Antithetic Acceleration of Monte Carlo Integration in Bayesian Inference, Journal of Econometrics . vol. 38, no. 1-2, pp. 73-90. ViewDownload from: UTS OPUS or Publishers site It is proposed to sample antithetically rather than randomly from the posterior density in Bayesian inference using Monte Carlo integration. Conditions are established under which the number of replications required with antithetic sampling relative to the number required with random sampling is inversely proportional to sample size, as sample size increases. The result is illustrated in an experiment using a bivariate vector autoregression. Geweke, J. 1988, An Application of Operational-Subjective Statistical Methods to Rational Expectations, Journal Of Business amp Economic Statistics . vol. 6, pp. 465-466. Geweke, J. 1988, Operational Bayesian Methods in Econometrics, Journal of Economic Perspectives . vol. 2, pp. 159-166. Geweke, J. 
Geweke, J. 1988, An Application of Operational-Subjective Statistical Methods to Rational Expectations, Journal of Business & Economic Statistics, vol. 6, pp. 465-466.

Geweke, J. 1988, Operational Bayesian Methods in Econometrics, Journal of Economic Perspectives, vol. 2, pp. 159-166.

Geweke, J. 1988, Employment Discrimination and Statistical Science, Statistical Science, vol. 3, pp. 188-189.

Geweke, J. 1988, Checks of Model Adequacy for Univariate Time Series Models and Their Application to Econometric Relationships, Econometric Reviews, vol. 7, no. 1, pp. 59-62.

Geweke, J. 1988, The Secular and Cyclical Behavior of Real GDP in Nineteen OECD Countries, 1957-1983, Journal of Business & Economic Statistics, vol. 6, pp. 479-486. Abstract: Log per capita real gross domestic product is modeled as a third-order autoregression with a pair of complex roots whose amplitude is smaller than the amplitude of the real root. The behavior of this time series is interpreted in terms of these two amplitudes, the periodicity of the complex roots, and the standard deviation of the disturbance. Restrictions are evaluated and inference is conducted using the likelihood principle, applying Monte Carlo integration with importance sampling. These Bayesian procedures efficiently cope with restrictions that are awkward under a classical approach. We find very little difference between countries in the amplitudes of the real and complex roots, relative to within-country uncertainty. There are some substantial differences in the periodicities of complex roots, and the greatest differences between countries are found in the standard deviation of the disturbance.

Geweke, J. & Froeb, L. 1987, Long Run Competition in the U.S. Aluminum Industry, International Journal of Industrial Organization, vol. 5, pp. 67-78. View/Download from: Publisher's site. Abstract: A methodology for examining dynamic structure-performance relationships in a single industry is proposed and illustrated. Implications of long run competitive behavior for a simple simultaneous equations model of structure and performance are derived and tested using recently developed methods for the interpretation of economic time series. It is concluded that the structure and performance of the U.S. aluminum industry in the postwar period conform well with the hypothesis that the primary aluminum market was competitive in the long run.

Geweke, J., Marshall, R. & Zarkin, G. 1986, Mobility Indices in Continuous Time Markov Chains, Econometrica, vol. 54, pp. 1407-1423. View/Download from: Publisher's site. Abstract: The axiomatic derivation of mobility indices for first-order Markov chain models in discrete time is extended to continuous-time models. Many of the logical inconsistencies among axioms noted in the literature for the discrete time models do not arise for continuous time models. It is shown how mobility indices in continuous time Markov chains may be estimated from observations at two points in time. Specific attention is given to the case in which the states are fractiles, and an empirical example is presented.

Geweke, J., Marshall, R. & Zarkin, G. 1986, Exact Inference for Continuous Time Markov Chains, Review of Economic Studies, vol. 53, pp. 653-669. View/Download from: Publisher's site. Abstract: Methods for exact Bayesian inference under a uniform diffuse prior are set forth for the continuous time homogeneous Markov chain model. It is shown how the exact posterior distribution of any function of interest may be computed using Monte Carlo integration. The solution handles the problems of embeddability in a very natural way, and provides (to our knowledge) the only solution that systematically takes this problem into account. The methods are illustrated using several sets of data.
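The two continuous-time Markov chain entries above both revolve around recovering the intensity (generator) matrix from transition frequencies observed at discrete dates, where embeddability (whether the matrix logarithm of the transition matrix yields a valid generator) is the central difficulty. A minimal point-estimate sketch via the matrix logarithm; the transition matrix and time gap below are invented for illustration, and the papers' Bayesian Monte Carlo treatment goes well beyond this.

```python
import numpy as np
from scipy.linalg import logm

# Hypothetical transition matrix estimated from two observation dates
# one year apart (rows: state today, columns: state a year later).
P = np.array([[0.90, 0.08, 0.02],
              [0.10, 0.80, 0.10],
              [0.03, 0.12, 0.85]])
dt = 1.0                                   # years between observations

Q = logm(P).real / dt                      # candidate generator: P = expm(Q * dt)

# Embeddability check: off-diagonal intensities must be non-negative
# and rows must sum to zero (up to numerical error).
off_diag = Q[~np.eye(3, dtype=bool)]
print("candidate generator Q:\n", np.round(Q, 4))
print("rows sum to ~0:", np.allclose(Q.sum(axis=1), 0, atol=1e-10))
print("embeddable (off-diagonals >= 0):", np.all(off_diag >= -1e-10))
```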
Geweke, J. 1986, Exact Inference in the Inequality Constrained Normal Linear Regression Model, Journal of Applied Econometrics, vol. 1, pp. 127-141. Abstract: Inference in the inequality constrained normal linear regression model is approached as a problem in Bayesian inference, using a prior that is the product of a conventional uninformative distribution and an indicator function representing the inequality constraints. The posterior distribution is calculated using Monte Carlo numerical integration, which leads directly to the evaluation of expected values of functions of interest. This approach is compared with others that have been proposed. Three empirical examples illustrate the utility of the proposed methods using an inexpensive 32-bit microcomputer.

Geweke, J. 1986, The Superneutrality of Money in the United States: An Interpretation of the Evidence, Econometrica, vol. 54, pp. 1-22. View/Download from: Publisher's site. Abstract: Structural and stochastic neutrality have refutable implications for aggregate economic time series only in conjunction with other maintained hypotheses. Simple and commonly employed maintained hypotheses lead to restrictions on measures of feedback and their decomposition by frequency. These restrictions also suggest an empirical interpretation of the notional long and short runs. It is found that a century of annual U.S. data, and postwar monthly data, consistently support structural superneutrality of money with respect to output and the real rate of return, and consistently reject its superneutrality with respect to velocity. A quantitative characterization of the long run is suggested.

Geweke, J. 1986, Modeling Conditional Variance, Econometric Reviews, vol. 5, no. 1, pp. 57-61.

Geweke, J. 1985, Macroeconomic Modeling and the Theory of the Representative Agent, American Economic Review, vol. 75, pp. 206-210.

Geweke, J. & Porter-Hudak, S. 1984, The Estimation and Application of Long Memory Time Series Models, Journal of Time Series Analysis, vol. 4, pp. 221-238. Abstract: The definitions of fractional Gaussian noise and integrated (or fractionally differenced) series are generalized, and it is shown that the two concepts are equivalent. A new estimator of the long memory parameter in these models is proposed, based on the simple linear regression of the log periodogram on a deterministic regressor. The estimator is the ordinary least squares estimator of the slope parameter in this regression, formed using only the lowest frequency ordinates of the log periodogram. Its asymptotic distribution is derived, from which it is evident that the conventional interpretation of these least squares statistics is justified in large samples. Using synthetic data, the asymptotic theory proves to be reliable in samples of 50 observations or more. For three postwar monthly economic time series, the estimated integrated series model provides more reliable out-of-sample forecasts than do more conventional procedures.
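The log-periodogram regression in the entry above (now usually called the GPH estimator) is short enough to sketch in full: compute the periodogram, keep the lowest m frequencies, and regress the log periodogram on log(4 sin^2(omega/2)); minus the slope estimates the memory parameter d. The data-generating process and the bandwidth choice m = n^0.5 below are illustrative.

```python
import numpy as np

def gph(x, m=None):
    """Geweke & Porter-Hudak log-periodogram estimate of the memory parameter d."""
    n = len(x)
    m = m or int(np.sqrt(n))                       # common bandwidth choice
    freqs = 2 * np.pi * np.arange(1, m + 1) / n    # lowest Fourier frequencies
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    periodogram = np.abs(dft) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope

# Illustrative check on white noise, where the true d is 0.
rng = np.random.default_rng(6)
print(f"estimated d for white noise: {gph(rng.standard_normal(2048)):.3f}")
```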
Geweke, J. 1984, Measures of Conditional Linear Dependence and Feedback, Journal of the American Statistical Association, vol. 79, pp. 907-915. View/Download from: Publisher's site. Abstract: Measures of linear dependence and feedback for two multiple time series conditional on a third are defined. The measure of conditional linear dependence is the sum of linear feedback from the first to the second conditional on the third, linear feedback from the second to the first conditional on the third, and instantaneous linear feedback between the first and second series conditional on the third. The measures are non-negative and may be expressed in terms of measures of unconditional feedback between various combinations of the three series. The measures of conditional linear feedback can be additively decomposed by frequency. Estimates of these measures are straightforward to compute, and their distribution can be routinely approximated by bootstrap methods. An empirical example involving real output, money, and interest rates is presented.
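In the unconditional two-series case underlying this entry (and the 1982 JASA paper further down this list), the feedback measure from x to y is the log ratio of the innovation variance of y from an autoregression on its own past to that from a regression on the past of both series. A minimal sketch of that two-series measure; the simulated VAR data and lag length are illustrative, and the conditional version adds a third conditioning series to both regressions.

```python
import numpy as np

def lagged(z, p):
    """Stack lags 1..p of the columns of z as regressors."""
    return np.column_stack([z[p - k:len(z) - k] for k in range(1, p + 1)])

def feedback(x, y, p=4):
    """Geweke-style measure of linear feedback from x to y:
    ln(innovation variance of y | own past) - ln(variance | past of x and y)."""
    target = y[p:]
    own = np.column_stack([np.ones(len(target)), lagged(y[:, None], p)])
    both = np.column_stack([own, lagged(x[:, None], p)])
    resid_var = lambda X: np.var(target - X @ np.linalg.lstsq(X, target, rcond=None)[0])
    return np.log(resid_var(own) / resid_var(both))

# Illustrative data: x feeds into y, but not conversely.
rng = np.random.default_rng(7)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.standard_normal()

print(f"feedback x -> y: {feedback(x, y):.3f}")   # clearly positive
print(f"feedback y -> x: {feedback(y, x):.3f}")   # near zero
```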
Their behavior under alternatives is compared analytically using the concept of approximate slope, and these results are supported by the outcomes of Monte Carlo experiments. The implications of these comparisons for applied work are unambiguous: Wald variants of a test attributed to Granger, and a lagged dependent variable version of Sim039s test introduced in this paper, are equivalent in all relevant respects and are preferred to the other tests discussed. Geweke, J. amp Weisbrod, B. 1982, Clinical Evaluation vs. Economic Evaluation: The Case of a New Drug, Medical Care . vol. 20, pp. 821-830. ViewDownload from: Publishers site Abstract: To economically evaluate a new. drug or other medical innovation one must assess both the changes in costs and in benefits. Safety and efficacy matter, but so do resource costs and social benefits. This paper evaluates the effects on expenditures of the recent introduction of cimetidine, a drug used in the prevention and treatment of duodenal ulcers. This evaluation is of interest in its own right and also as a quotguidequot for studying similar effects of other innovations. State Medicaid records are used to test the effects on hospitalization and aggregate medical care expenditures of this new medical innovation. After controlling to the extent possible for potential selection bias, we find that: 1) usage of cimetidine is associated with a lower level of medical care expenditures and fewer days of hospitalization per patient for those duodenal ulcer patients who had zero health care expenditures and zero days of hospitalization during the presample period an annual cost saving of some 320.00 (20 per cent) per patient is indicated. Further analysis disclosed, however, that this saving was lower for patients with somewhat higher levels of health care expenditures and hospitalization in the presample period, and to some extent was reversed for the patients whose prior year039s medical care expenditures and hospitalization were highest. Geweke, J.. Parzen, E. Pierce, D. Wei, W. amp Zellner, A. 1982, The Measurement of Linear Dependence and Feedback Between Multiple Time Series, Journal of the American Statistical Association . vol. 77, pp. 304-324. ViewDownload from: Publishers site Abstract: Measures of linear dependence and feedback for multiple time series are defined. The measure of linear dependence is the sum of the measure of linear feedback from the first series to the second, linear feedback from the second to the first, and instantaneous linear feedback. The measures are nonnegative, and zero only when feedback (causality) of the relevant type is absent. The measures of linear feedback from one series Geweke, J. 1982, Measurement of Linear Dependence and Feedback Between Multiple Time Series, Journal of the American Statistical Association . Geweke, J. amp Singleton, K. 1981, Latent Variable Models for Time Series: A Frequency Domain Approach with an Application to the Permanent Income Hypothesis, Journal of Econometrics . vol. 17, no. 3, pp. 287-304. ViewDownload from: UTS OPUS or Publishers site Abstract: The theory of estimation and inference in a very general class of latent variable models for time series is developed by showing that the distribution theory for the finite Fourier transform of the observable variables in latent variable models for time series is isomorphic to that for the observable variables themselves in classical latent variable models. 
This implies that analytic work on classical latent variable models can be adapted to latent variable models for time series, an implication which is illustrated here in the context of a general canonical form. To provide an empirical example a latent variable model for permanent income is developed, its parameters are shown to be identified, and a variety of restrictions on these parameters implied by the permanent income hypothesis are tested. Geweke, J. 1981, The Approximate Slopes of Econometric Tests, Econometrica . vol. 49, no. 6, pp. 1427-1442. ViewDownload from: UTS OPUS or Publishers site Abstract: In this paper the concept of approximate slope, introduced by R. R. Bahadur, is used to make asymptotic global power comparisons of econometric tests. The approximate slope of a test is the rate at which the logarithm of the asymptotic marginal significance level of the test decreases as sample size increases, under a given alternative. A test with greater approximate slope may therefore be expected to reject the null hypothesis more frequently under that alternative than one with smaller approximate slope. Two theorems, which facilitate the computation and interpretation of the approximate slopes of most econometric tests, are established. These results are used to undertake some illustrative comparisons. Sampling experiments and an empirical illustration suggest that the comparison of approximate slopes may provide an adequate basis for evaluating the actual performance of alternative tests of the same hypothesis Geweke, J. amp Meese, R. 1981, Estimating Regression Models of Finite but Unknown Order, International Economic Review . vol. 22, no. 1, pp. 54-70. ViewDownload from: UTS OPUS Examines problems associated with the estimation of the normal linear regression model of finite but unknown sequence of nested alternatives. Estimation criteria for the model selection Derivation of the numerical bounds on the finite sample distribution Relation of the estimation criterion functions proposed to other estimation criterion functions. Geweke, J. amp Singleton, K. 1981, Maximum Likelihood Confirmatory Factor Analysis of Economic Time Series, International Economic Review . vol. 22, no. 1, pp. 37-54. ViewDownload from: UTS OPUS or Publishers site Explains the theory of identification, estimation and inference in the dynamic confirmatory factor model for the economic time series. Derivation of the frequency domain representation of the model Illustration of the nature of the identification problem for the dynamic confirmatory model Dynamic confirmatory model of the business cycle motivated by Lucas theory of aggregate activity. Geweke, J. 1981, A Comparison of Tests of the Independence of Two Covariance Stationary Time Series, Journal of the American Statistical Association . vol. 76, no. 374, pp. 363-373. ViewDownload from: UTS OPUS or Publishers site Abstract: The approximate slopes of several tests of the independence of two covariance stationary time series are derived and compared. It is shown that the approximate slopes of regression tests are at least as great as those based on the residuals of univariate ARIMA models, and that there are cases in which the former are arbitrarily great while the latter are arbitrarily small. 
Geweke, J. & Singleton, K. 1980, Interpreting the Likelihood Ratio Statistic in Factor Models When Sample Size is Small, Journal of the American Statistical Association, vol. 75, no. 369, pp. 133-137. Abstract: The use of the likelihood ratio statistic in testing the goodness of fit of the exploratory factor model has no formal justification when, as is often the case in practice, the usual regularity conditions are not met. In a Monte Carlo experiment it is found that the asymptotic theory seems to be appropriate when the regularity conditions obtain and sample size is at least 30. When the regularity conditions are not satisfied, the asymptotic theory seems to be misleading in all sample sizes considered.

Geweke, J. & Feige, E. 1979, Some Joint Tests of the Efficiency of Markets for Forward Foreign Exchange, Review of Economics and Statistics, vol. 61, no. 3, pp. 334-341.

Geweke, J. 1978, Testing the Exogeneity Specification in the Complete Dynamic Simultaneous Equation Model, Journal of Econometrics, vol. 7, no. 2, pp. 163-185. Abstract: It is shown that in the complete dynamic simultaneous equation model exogenous variables cause endogenous variables in the sense of Granger (1969) and satisfy the criterion of econometric exogeneity discussed by Sims (1977a), but that the stationarity assumptions invoked by Granger and Sims are not necessary for this implication. Inference procedures for testing each implication are presented and a new joint test of both implications is derived. Detailed attention is given to estimation and testing when the error vector of the final form of the complete dynamic simultaneous equation model is both singular and serially correlated. The theoretical points of the paper are illustrated by testing the exogeneity specification in a small macroeconometric model. (A minimal regression version of the Granger test that underlies this literature is sketched below.)

Geweke, J. 1978, Temporal Aggregation in the Multiple Regression Model, Econometrica, vol. 46, no. 3, pp. 643-661. Abstract: The regression relation between regularly sampled Y(t) and X_1(t), ..., X_N(t) implied by an underlying model in which time enters more generally is studied. The underlying model includes continuous distributed lags, discrete models, and stochastic differential equations as special cases. The relation between parameters identified by regular samplings of Y and X_j and those of the underlying model is characterized. Sufficient conditions for identification of the underlying model in the limit as disaggregation over time proceeds are set forth. Empirical evidence presented suggests that important gains can be realized from temporal disaggregation in the range of conventional measurement frequencies for macroeconomic data.
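The Granger-causality and exogeneity tests running through the papers above are, in practice, restricted-versus-unrestricted regressions. As a concrete illustration, here is a minimal textbook construction in Python; the function name and tuning choices are mine and are not drawn from any of the papers.

```python
import numpy as np
from scipy.stats import f as f_dist

def granger_f(x, y, p=4):
    """F test of the null that x does not Granger-cause y: compare a
    regression of y_t on p own lags with one that adds p lags of x."""
    T = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    const = np.ones((T - p, 1))
    rss = lambda M: np.sum((Y - M @ np.linalg.lstsq(M, Y, rcond=None)[0]) ** 2)
    rss_r = rss(np.hstack([const, lags_y]))           # restricted model
    rss_u = rss(np.hstack([const, lags_y, lags_x]))   # unrestricted model
    df1, df2 = p, (T - p) - (2 * p + 1)
    F = ((rss_r - rss_u) / df1) / (rss_u / df2)
    return F, f_dist.sf(F, df1, df2)                  # statistic and p-value

# Toy usage: x leads y by one period, so the null should be rejected.
rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
print(granger_f(x, y))
```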
Geweke, J. 1976, review of John Rutledge, A Monetarist Model of Inflationary Expectations (D. C. Heath, Lexington, Massachusetts, 1974; pp. xv + 115, $12.50), Journal of Monetary Economics, vol. 2, no. 1, pp. 125-127.

Geweke, J., Durham, G. & Xu, H., Bayesian Inference for Logistic Regression Models Using Sequential Posterior Simulation. The logistic specification has been used extensively in non-Bayesian statistics to model the dependence of discrete outcomes on the values of specified covariates. Because the likelihood function is globally weakly concave, estimation by maximum likelihood is generally straightforward even in commonly arising applications with scores or hundreds of parameters. In contrast, Bayesian inference has proven awkward, requiring normal approximations to the likelihood or specialized adaptations of existing Markov chain Monte Carlo and data augmentation methods. This paper approaches Bayesian inference in logistic models using recently developed generic sequential posterior simulation (SPS) methods that require little more than the ability to evaluate the likelihood function. Compared with existing alternatives SPS is much simpler, and provides numerical standard errors and accurate approximations of marginal likelihoods as by-products. The SPS algorithm for Bayesian inference is amenable to massively parallel implementation, and when implemented using graphical processing units it is more efficient than existing alternatives. The paper demonstrates these points by means of several examples. (A toy sketch of the underlying tempering idea appears below.)

Geweke, J., Gowrisankaran, G. & Town, R. J., Inferring Hospital Quality from Patient Discharge Records Using a Bayesian Selection Model. This paper develops new econometric methods to estimate hospital quality and other models with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which the distances between a patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, which attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 77,937 Medicare patients admitted to 117 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds higher quality in smaller hospitals than in larger ones, and in private for-profit hospitals than in hospitals in other ownership categories. Variation in unobserved severity of illness across hospitals is at least as great as variation in hospital quality. Consequently a conventional probit model leads to inferences about quality markedly different from those in this study's selection model.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. & Thorp, S. J. 2011, Financial Competence, Risk Presentation and Retirement Portfolio Preferences.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. E. & Thorp, S. J. 2011, Financial Competence and Expectations Formation: Evidence from Australia.
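The SPS methods named in the logistic-regression entry above are adaptive, massively parallel samplers with formal numerical-accuracy guarantees; none of that machinery is reproduced here. As a rough illustration of the core idea, moving particles from prior to posterior by tempering a likelihood you can evaluate, here is a minimal Python sketch. Every name and tuning constant is mine, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

def loglik(B, X, y):
    """Logistic log-likelihood, one value per particle (row of B)."""
    Z = B @ X.T                                   # (particles, observations)
    return (Z * y).sum(axis=1) - np.logaddexp(0.0, Z).sum(axis=1)

def sps_logistic(X, y, n_particles=2000, n_steps=30, n_mh=5, prior_sd=10.0):
    """Move particles from the N(0, prior_sd^2 I) prior to the posterior
    by tempering: target_t(b) is proportional to prior(b) * lik(b)^phi_t."""
    d = X.shape[1]
    B = rng.normal(scale=prior_sd, size=(n_particles, d))
    phis = np.linspace(0.0, 1.0, n_steps + 1)
    for phi_prev, phi in zip(phis[:-1], phis[1:]):
        # Reweight by the likelihood increment, then resample.
        ll = loglik(B, X, y)
        w = np.exp((phi - phi_prev) * (ll - ll.max()))
        B = B[rng.choice(n_particles, size=n_particles, p=w / w.sum())]
        # Rejuvenate with random-walk Metropolis steps at temperature phi.
        scale = 0.5 * B.std(axis=0) + 1e-8
        lp = phi * loglik(B, X, y) - 0.5 * (B ** 2).sum(1) / prior_sd ** 2
        for _ in range(n_mh):
            Bp = B + rng.normal(size=B.shape) * scale
            lpp = phi * loglik(Bp, X, y) - 0.5 * (Bp ** 2).sum(1) / prior_sd ** 2
            accept = np.log(rng.uniform(size=n_particles)) < lpp - lp
            B[accept], lp[accept] = Bp[accept], lpp[accept]
        # (A real SPS implementation adapts the temperature schedule and
        # proposal scales, monitors effective sample size, and returns
        # numerical standard errors and the marginal likelihood.)
    return B

# Toy usage: recover known coefficients from simulated data.
n, beta_true = 400, np.array([1.0, -2.0, 0.5])
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X @ beta_true))).astype(float)
print(sps_logistic(X, y).mean(axis=0))            # approximate posterior mean
```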
Geweke, J. 2010, Comment on Convergence Properties of the Likelihood of Computed Dynamic Models. There are technical errors in this article (Econometrica, January 2006) that are important, simple and correctable. The corrections substantially alter the article's conclusions.

Berg, J. E., Geweke, J. & Rietz, T. A. 2010, Memoirs of an Indifferent Trader: Estimating Forecast Distributions from Prediction Markets. Prediction markets for future events are increasingly common and they often trade several contracts for the same event. This paper considers the distribution of a normative risk-neutral trader who, given any portfolio of contracts traded on the event, would choose not to reallocate that portfolio of contracts even if transactions costs were zero. Because common parametric distributions can conflict with observed prediction market prices, the distribution is given a nonparametric representation together with a prior distribution favoring smooth and concentrated distributions. Posterior modal distributions are found for popular vote shares of the U.S. presidential candidates in the 100 days leading up to the elections of 1992, 1996, 2000, and 2004, using bid and ask prices on multiple contracts from the Iowa Electronic Markets. On some days, the distributions are multimodal or substantially asymmetric. The derived distributions are more concentrated than the historical distribution of popular vote shares in presidential elections, but do not tend to become more concentrated as time to elections diminishes.

Geweke, J., Ackerberg, D. & Hahn, J. 2010, Comments on Convergence Properties of the Likelihood of Computed Dynamic Models.

Geweke, J. 2010, Bayesian and Non-Bayesian Analysis of the Seemingly Unrelated Regression Model with Student-t Errors and Its Application to Forecasting: Comment.

Bateman, H., Eckert, C., Geweke, J., Louviere, J. J., Satchell, S. & Thorp, S. J. 2010, Economic Rationality, Risk Presentation, and Retirement Portfolio Choice.

Geweke, J., Horowitz, J. & Pesaran, M. H. 2006, Econometrics: A Bird's Eye View. As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross-sectional data by means of semi-parametric and non-parametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Non-linear econometric techniques are used increasingly in the analysis of cross section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus, largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework in which the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundation of real-time econometrics. This paper attempts to provide an overview of some of these developments.
Geweke, J., Gowrisankaran, G. & Town, R. J. 2002, Bayesian inference for hospital quality in a selection model. This paper develops new econometric methods to infer hospital quality in a model with discrete dependent variables and non-random selection. Mortality rates in patient discharge records are widely used to infer hospital quality. However, hospital admission is not random and some hospitals may attract patients with greater unobserved severity of illness than others. In this situation the assumption of random admission leads to spurious inference about hospital quality. This study controls for hospital selection using a model in which the distances between a patient's residence and alternative hospitals are key exogenous variables. Bayesian inference in this model is feasible using a Markov chain Monte Carlo posterior simulator, which attaches posterior probabilities to quality comparisons between individual hospitals and groups of hospitals. The study uses data on 74,848 Medicare patients admitted to 114 hospitals in Los Angeles County from 1989 through 1992 with a diagnosis of pneumonia. It finds the smallest and largest hospitals to be of the highest quality. There is strong evidence of dependence between the unobserved severity of illness and the assignment of patients to hospitals, whereby patients with a high unobserved severity of illness are disproportionately admitted to high-quality hospitals. Consequently a conventional probit model leads to inferences about quality markedly different from those in this study's selection model.

Geweke, J. F. 1998, Using simulation methods for Bayesian econometric models: inference, development, and communication. This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.

Geweke, J. F., Keane, M. P. & Runkle, D. E. 1994, Alternative computational approaches to inference in the multinomial probit model. This research compares several approaches to inference in the multinomial probit model, based on Monte Carlo results for a seven-choice model. The experiment compares the simulated maximum likelihood (SML) estimator using the GHK recursive probability simulator, the method of simulated moments (MSM) estimator using the GHK recursive simulator and kernel-smoothed frequency simulators, and posterior means using a Gibbs sampling-data augmentation algorithm. Each estimator is applied in nine different models, which have from 1 to 40 free parameters. The performance of all estimators is found to be satisfactory. However, the results indicate that the method of simulated moments estimator with the kernel-smoothed frequency simulator does not perform quite as well as the other three methods. Among those three, the Gibbs sampling-data augmentation algorithm appears to have a slight overall edge, with the relative performance of MSM and SML based on the GHK simulator difficult to determine. (The GHK recursive probability simulator is sketched below.)
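For readers unfamiliar with it, the GHK recursive probability simulator estimates the Gaussian orthant probabilities from which multinomial probit choice probabilities are built, by sampling sequentially from truncated normals and accumulating importance weights. The sketch below is the standard textbook construction under assumed notation, not code from the paper.

```python
import numpy as np
from scipy.stats import norm

def ghk_orthant(mu, Sigma, n_draws=10000, rng=None):
    """GHK estimate of P(Z < 0 elementwise) for Z ~ N(mu, Sigma).

    In a multinomial probit, the probability of choosing alternative j
    is such an orthant probability for the latent utility differences."""
    rng = rng or np.random.default_rng(0)
    m = len(mu)
    L = np.linalg.cholesky(Sigma)          # Z = mu + L @ e with e ~ N(0, I)
    e = np.zeros((n_draws, m))
    weight = np.ones(n_draws)
    for j in range(m):
        # Z_j < 0 given e_1..e_{j-1} pins e_j below this bound:
        bound = (-mu[j] - e[:, :j] @ L[j, :j]) / L[j, j]
        p = np.clip(norm.cdf(bound), 1e-300, 1.0)
        weight *= p                        # importance weight factor
        # Draw e_j ~ N(0,1) truncated above at bound, by inverse CDF.
        u = rng.uniform(size=n_draws)
        e[:, j] = norm.ppf(np.clip(u * p, 1e-300, None))
    return weight.mean()

# Toy usage: compare with a crude Monte Carlo estimate.
mu = np.array([0.3, -0.2, 0.1])
Sigma = np.array([[1.0, 0.5, 0.2], [0.5, 1.0, 0.3], [0.2, 0.3, 1.0]])
rng = np.random.default_rng(0)
crude = (rng.multivariate_normal(mu, Sigma, 200000) < 0).all(axis=1).mean()
print(ghk_orthant(mu, Sigma), crude)       # the two should be close
```

The point of the recursion is smoothness: each draw contributes a product of normal CDFs rather than an indicator, so the estimate varies smoothly in the parameters, which is what makes it usable inside SML and MSM estimation.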
Geweke, J. & Keane, M., An Empirical Analysis of Income Dynamics among Men in the PSID: 1968-1989. This study uses data from the Panel Study of Income Dynamics (PSID) to address a number of questions about life-cycle earnings mobility. It develops a dynamic reduced-form model of earnings and marital status that is nonstationary over the life cycle. A Gibbs sampling-data augmentation algorithm facilitates use of the entire sample and provides numerical approximations to the exact posterior distribution of properties of earnings paths. This algorithm copes with the complex distribution of endogenous variables that are observed for short segments of an individual's work history, not including the initial period. The study reaches several firm conclusions about life-cycle earnings mobility. Incorporating non-Gaussian shocks makes it possible to account for transitions between low and higher earnings states, a heretofore unresolved problem. The non-Gaussian distribution substantially increases the lifetime return to postsecondary education, and substantially reduces differences in lifetime wages attributable to race. In a given year, the majority of variance in earnings not accounted for by race, education, and age is due to transitory shocks, but over a lifetime the majority is due to unobserved individual heterogeneity. Consequently, low earnings at early ages are strong predictors of low earnings later in life, even conditioning on observed individual characteristics. (The Gibbs sampling-data augmentation idea is illustrated for a simpler model below.)
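The Gibbs sampling-data augmentation approach used in this entry and in the multinomial probit comparison above is easiest to see in a plain binary probit, where it is usually credited to Albert and Chib. A minimal sketch under a flat prior follows; the function name, prior choice, and tuning are mine, not taken from either paper.

```python
import numpy as np
from scipy.stats import truncnorm

def probit_gibbs(X, y, n_draws=2000, rng=None):
    """Gibbs sampler for binary probit via data augmentation."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    V = np.linalg.inv(X.T @ X)         # posterior covariance under a flat prior
    L = np.linalg.cholesky(V)
    beta = np.zeros(d)
    draws = np.empty((n_draws, d))
    for s in range(n_draws):
        mu = X @ beta
        # Latent z_i = mu_i + eps_i with eps_i ~ N(0,1), truncated so that
        # z_i > 0 when y_i = 1 and z_i < 0 when y_i = 0.
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
        # beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1})
        beta = V @ (X.T @ z) + L @ rng.normal(size=d)
        draws[s] = beta
    return draws

# Toy usage: simulate data and report the posterior mean after burn-in.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = (X @ np.array([0.5, -1.0]) + rng.normal(size=n) > 0).astype(int)
print(probit_gibbs(X, y)[500:].mean(axis=0))
```

Augmenting with the latent z turns an awkward posterior into two easy conditional draws, which is exactly the trick the income-dynamics paper scales up to its much richer model.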
Geweke, J., Computational Experiments and Reality. A common practice in macroeconomics is to assess the validity of general equilibrium models by first deriving their implications for population moments and then comparing population moments with observed sample moments. Generally the population moments are not explicit functions of model parameters, and so computational experiments are used to establish the link between parameters and moments. In most cases the general equilibrium models are intended to describe certain population moments (for example, means) but not others (for example, variances). The comparison of population moments with observed sample moments is informal, a process that has been termed calibration by some economists and ocular econometrics by others. This paper provides a formal probability framework within which this approach to inference can be studied. There are two principal results. First, if general equilibrium models are taken as predictive for sample moments, then the formal econometrics of model evaluation and comparison are straightforward. The fact that the models describe only a subset of moments presents no obstacles, and the formal econometrics yield as a byproduct substantial insights into the workings of models. Second, if general equilibrium models are taken to establish implications for population moments but not sample moments, then there is no link to reality because population moments are unobserved. Under this assumption, atheoretical macroeconomic models that link population and sample moments can be introduced coherently into the formal econometrics of model evaluation and comparison. The result is a framework that unifies general equilibrium models (theory without measurement) and atheoretical econometrics (measurement without theory). The paper illustrates these points using some models of the equity premium.

Geweke, J. & Amisano, G., Hierarchical Markov Normal Mixture Models with Applications to Financial Asset Returns. With the aim of constructing predictive distributions for daily returns, we introduce a new Markov normal mixture model in which the components are themselves normal mixtures. We derive the restrictions on the autocovariances and linear representation of integer powers of the time series in terms of the number of components in the mixture and the roots of the Markov process. We use the model's prior predictive distribution to study its implications for some interesting functions of returns. We apply the model to construct predictive distributions of daily S&P 500 returns, dollar/pound returns, and one- and ten-year bonds. We compare the performance of the model with ARCH and stochastic volatility models using predictive likelihoods. The model's performance is about the same as its competitors' for the bond returns, better than its competitors' for the S&P 500 returns, and much better for the dollar/pound returns. Validation exercises identify some potential improvements. JEL Classification: C53, G12, C11, C14. (A toy simulation of the outer Markov mixture layer appears below.)

Geweke, J., Gowrisankaran, G. & Town, R. J., Bayesian Inference for Hospital Quality in a Selection Model.
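The hierarchical model in the Geweke-Amisano paper layers a Markov chain over components that are themselves normal mixtures; the sketch below simulates only the outer layer, a plain Markov mixture of normals, to show why such models generate the volatility clustering and fat tails seen in daily returns. All names and parameter values are illustrative, not the paper's.

```python
import numpy as np

def simulate_markov_mixture(T, P, mu, sigma, rng=None):
    """Simulate y_t ~ N(mu[s_t], sigma[s_t]^2) where the state s_t
    follows a first-order Markov chain with transition matrix P."""
    rng = rng or np.random.default_rng(0)
    states = np.empty(T, dtype=int)
    s = 0
    for t in range(T):
        states[t] = s
        s = rng.choice(len(mu), p=P[s])
    return mu[states] + sigma[states] * rng.normal(size=T)

# Toy usage: a persistent calm regime plus a rarer volatile regime.
P = np.array([[0.98, 0.02], [0.10, 0.90]])
r = simulate_markov_mixture(2500, P, mu=np.array([0.0005, -0.001]),
                            sigma=np.array([0.007, 0.025]))
# Fraction of observations beyond four standard deviations; a Gaussian
# series of this length would almost never produce any.
print(r.std(), (np.abs(r - r.mean()) > 4 * r.std()).mean())
```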

