Science and Technology

The article introduces the All Minorities at Risk (AMAR) data, a sample of socially recognized and salient ethnic groups. Fully coded for the forty core Minorities at Risk variables, the AMAR sample provides researchers with data for empirical analysis free from the selection issues that have marked the study of ethnic politics to date. We describe the distinct selection issues motivating the coding of the data, with an emphasis on underexplored selection issues arising from the truncation of ethnic group data, especially when moving between levels of data. We then describe our sampling technique and the resulting coded data. Next, we suggest some directions for the future study of ethnicity and conflict using our bias-corrected data. Our preliminary correlations suggest that selection bias may have distorted our understanding of both group and country correlates of ethnic violence.
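
To make the truncation problem concrete, here is a purely hypothetical simulation (not from the article; all variables and values are invented for illustration) showing how selecting groups on an outcome-related threshold, as a "minorities at risk" criterion effectively does, attenuates an estimated correlation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a group-level covariate (say, group size) and a
# latent "risk" outcome that is only weakly related to it.
n = 100_000
covariate = rng.normal(size=n)
risk = 0.1 * covariate + rng.normal(size=n)

# Full-sample correlation recovers the true (weak) relationship.
print(np.corrcoef(covariate, risk)[0, 1])  # ~0.10

# Risk-based selection: only groups above a risk threshold enter the
# dataset. Truncating on the outcome restricts its variance and biases
# the estimated correlation toward zero.
selected = risk > 1.0
print(np.corrcoef(covariate[selected], risk[selected])[0, 1])  # much weaker
```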

Publication Type: Journal Articles
Journal Publisher: Journal of Conflict Resolution
Authors: David Laitin

Abstract: As a potential measure for mitigating the contribution of fossil fuel emissions to global warming, carbon dioxide (CO2) capture and storage (CCS) entails capturing CO2 from large industrial sources, compressing it to a dense supercritical form (scCO2), injecting it deep into suitable reservoirs, and storing it permanently. After 20+ years of research on CCS, including various applied studies involving pilot and demonstration projects, many stakeholders believe that the world is now ready to move from demonstration to industrial-scale implementation. Yet many hurdles remain, ranging from mostly technical issues to economic and public perception concerns. This talk provides a broad overview of the decades of research on CO2 storage and discusses what has been learned versus what challenges remain. The presentation also elaborates on California as an instructive example of the complicated road to deployment at scale: ambitious climate goals and generous carbon credits should make project economics work, yet no California CCS project has materialized to date.

Speaker Bio: Jens Birkholzer is an internationally recognized expert in subsurface energy applications and environmental impact assessment. He is a Senior Scientist at the Lawrence Berkeley National Laboratory (LBNL, Berkeley Lab) in Berkeley, California, and currently serves as the Director for the Energy Geosciences Division (EGD) at LBNL. He received his Ph.D. in water resources, hydrology, and soil science from Aachen University of Technology in Germany in 1994. Jens joined LBNL in 1994, left for a management position in his native Germany in 1999, and eventually returned to LBNL in 2001. He has over 400 scientific publications, about 125 of which are in peer-reviewed journals, in addition to numerous research reports. He serves as the Associate Editor of the International Journal of Greenhouse Gas Control (IJGGC) and is also on the Board of Editorial Policy Advisors for the Journal of Geomechanics for Energy and Environment (GETE). Jens leads the international DECOVALEX Project as its Chairman, is a Fellow of the Geological Society of America, and serves as a Senior Fellow of the California Council on Science and Technology.

William J. Perry Conference Room

Encina Hall, 2nd floor

616 Serra Street

Stanford, CA 94305

Jens Birkholzer, Lawrence Berkeley National Laboratory
Seminars

The Stata package krls as well as the R package KRLS implement kernel-based regularized least squares (KRLS), a machine learning method described in Hainmueller and Hazlett (2014) that allows users to tackle regression and classification problems without strong functional form assumptions or a specification search. The flexible KRLS estimator learns the functional form from the data, thereby protecting inferences against misspecification bias. Nevertheless, it allows for interpretability and inference in ways similar to ordinary regression models. In particular, KRLS provides closed-form estimates for the predicted values, variances, and the pointwise partial derivatives that characterize the marginal effects of each independent variable at each data point in the covariate space. The method is thus a convenient and powerful alternative to ordinary least squares and other generalized linear models for regression-based analyses.
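
As a rough sketch of the closed-form quantities described above, here is a minimal Python implementation of a KRLS-style estimator (Gaussian kernel ridge regression with analytic pointwise derivatives). It is a simplified stand-in, not the code of the krls or KRLS packages, and the function names and defaults are our own assumptions:

```python
import numpy as np

def krls_fit(X, y, sigma2=None, lam=1.0):
    """Kernel-regularized least squares with a Gaussian kernel.

    Returns the dual coefficients c solving (K + lam*I) c = y.
    """
    n, d = X.shape
    if sigma2 is None:
        sigma2 = d  # illustrative default: bandwidth set to the dimension
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / sigma2)
    c = np.linalg.solve(K + lam * np.eye(n), y)
    return c, sigma2

def krls_predict_and_derivs(X_train, c, sigma2, X_new):
    """Closed-form fitted values and pointwise partial derivatives.

    For k(x, z) = exp(-||x - z||^2 / sigma2),
    d yhat(x) / d x_j = -(2/sigma2) * sum_i c_i k(x, x_i) (x_j - x_ij).
    """
    sq = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / sigma2)                          # (m, n)
    yhat = K @ c
    diff = X_new[:, None, :] - X_train[None, :, :]    # (m, n, d)
    derivs = -(2.0 / sigma2) * np.einsum('mn,mnd,n->md', K, diff, c)
    return yhat, derivs

# Toy usage: learn y = sin(x0) + x1 without specifying that functional form.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] + 0.1 * rng.normal(size=200)
c, s2 = krls_fit(X, y, lam=0.5)
yhat, d = krls_predict_and_derivs(X, c, s2, X)
print(d.mean(axis=0))  # average marginal effect of each covariate
```

Averaging the pointwise derivatives over the sample, as in the last line, yields one average-marginal-effect summary per covariate, which is roughly the kind of regression-style output the paragraph alludes to.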

Publication Type: Journal Articles
Journal Publisher: Journal of Statistical Software

Abstract: Artificial intelligence (AI) is rapidly improving. The opportunities are tremendous, but so are the risks. Existing and soon-to-exist capabilities pose several plausible extreme governance challenges. These include massive labor displacement, extreme inequality, an oligopolistic global market structure, reinforced authoritarianism, shifts and volatility in national power, and strategic instability. Further, there is no apparent ceiling to AI capabilities, experts envision that superhuman capabilities in strategic domains will be achieved in the coming four decades, and radical surprise breakthroughs are possible. Such achievements would likely transform wealth, power, and world order, though global politics will in turn crucially shape how AI is developed and deployed. The consequences are plausibly of a magnitude and on a timescale to dwarf other global concerns, leaders of governments and firms are asking for policy guidance, and yet scholarly attention to the AI revolution remains negligible. Research is thus urgently needed on the AI governance problem: the problem of devising global norms, policies, and institutions to best ensure the beneficial development and use of advanced AI.

This problem can be broken into three complementary research clusters:

  1. The technical landscape: What are the trends and possibilities in AI capabilities? What are their likely consequences? What are the externalities from AI, and how can they best be addressed?
  2. AI politics: Who are the relevant actors, what are their interests, and what can they do? What is the nature of the conflict and cooperation challenges that they are likely to face? How can they overcome dangerous conflictual dynamics, in particular an international arms race?
  3. AI governance: Given our understanding of the technical landscape and AI politics, what options are available to us for global governance of AI and what should we work towards?

 

Work on the AI governance problem must draw on the full body of social science and policy expertise. Solutions are needed by an unknown, but plausibly impending, deadline.

Speaker Bio: Allan Dafoe is an Assistant Professor of Political Science at Yale University and a Research Associate at the Future of Humanity Institute, University of Oxford. His research seeks to understand the causes of world peace and stability. Specifically, his research has examined the causes of the liberal peace, and the role of reputation and honor as motives for war. He develops methodological tools and approaches to enable more transparent, credible causal inference. Allan is beginning research on the international politics of transformative artificial intelligence.

William J. Perry Conference Room

Encina Hall, 2nd floor

616 Serra Street

Stanford, CA 94305

Allan Dafoe, Assistant Professor of Political Science, Yale University
Seminars
This event is co-sponsored by the Stanford Silicon Valley-New Japan Project and the Japan Society of Northern California.

When the Fukushima Daiichi Nuclear Power Plant experienced a meltdown after the Great East Japan Earthquake in March 2011, people scrambled to get accurate data on radiation. Geiger counters were suddenly a hot commodity. In that moment of crisis, a group of global citizens rose to the occasion to launch Safecast, an open data platform to track, monitor, and share data on radiation levels in Fukushima and throughout Japan. Safecast, a Japan Earthquake Relief Fund grantee, enlisted the help of volunteers who collected data from all over Japan, and even built its own DIY Geiger counter kit. The Japan Society of Northern California and the Stanford Silicon Valley-New Japan Project are proud to present a program with Pieter Franken, the Co-Founder of Safecast, who will look back at Safecast's evolution over the last six years, a prime example of citizen science embracing open data and open source, and discuss its plans to expand its data gathering efforts to take on new environmental challenges.

Bio

Pieter Franken's career spans over 25 years in financial services, specializing in O&T, Fintech, innovation, and large-scale transformations. He has held C-level and executive positions with industry leaders such as Citigroup, Shinsei Bank, Aplus, Monex Securities, and Monex Group. His hallmark is pioneering innovative services by implementing bleeding-edge technologies while minimizing time-to-market and dramatically reducing costs. Versed in large-scale IT transformation, bi-modal management, innovation, software development, datacenter operations, financial operations, and Fintech, he is a much sought-after advisor and speaker on a wide range of topics and is known for providing deep insights drawn from his wide experience in IT, financial services, and innovation management.

Pieter is currently a Senior Advisor at Monex Group (a leading online securities and financial services company in Japan), where he focuses on the Future of Financial Services, Group IT Strategy, Fintech, and Blockchain.

He is also a member of the Monetary Authority of Singapore (MAS) International Technology Advisory Panel (ITAP), where he contributes to the transformation of Singapore into a leading Fintech hub. In 2011 Pieter co-founded Safecast.org, a global volunteer initiative to collect citizen-sourced environmental data. Pieter also advises startups, such as ModuleQ, an AI startup based in Silicon Valley. Pieter holds an MSc in Computer Science from Delft University (The Netherlands) and is currently a researcher with the MIT Media Lab (US) and Keio University (Japan), where he contributes to advances in IoT, digital currencies, blockchain technologies, and citizen science. Pieter is based in Japan and frequently travels across Asia, North America, and Europe.

Agenda

4:15pm: Doors open
4:30pm-5:30pm: Talk and Discussion
5:30pm-6:00pm: Networking

RSVP Required

 
For more information about the Silicon Valley-New Japan Project please visit: http://www.stanford-svnj.org/

 

Pieter Franken, Senior Advisor, Monex Group
Seminars

Large-scale crop monitoring and yield estimation are important for both scientific research and practical applications. Satellite remote sensing provides an effective means for regional and global cropland monitoring, particularly in data-sparse regions that lack reliable ground observations and reporting. The conventional approach of using visible and near-infrared based vegetation index (VI) observations has prevailed for decades since the onset of the global satellite era. However, other satellite data encompass diverse spectral ranges that may contain complementary information on crop growth and yield, but they have been largely understudied and underused. Here we conducted one of the first attempts at synergizing multiple satellite data sources spanning a diverse spectral range, including visible, near-infrared, thermal, and microwave, into one framework to estimate crop yield for the U.S. Corn Belt, one of the world's most important food baskets. Overall, using satellite data from various spectral bands significantly improves regional crop yield predictions. The additional use of ancillary climate data (e.g., precipitation and temperature) further improves model skill, in part because the crop reproductive stage related to harvest index is highly sensitive to environmental stresses that are not fully captured by the satellite data used in our study. We conclude that using satellite data across various spectral ranges can improve monitoring of large-scale crop growth and yield beyond what can be achieved with individual sensors. These results also inform the synergistic use and development of current and next-generation satellite missions, including NASA ECOSTRESS, SMAP, and OCO-2, for agricultural applications.
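
As a schematic illustration of this kind of multi-sensor fusion (not the authors' actual pipeline; the feature names, values, and model choice below are assumptions made for the sketch), features derived from different spectral domains can be stacked with climate covariates into a single yield regression:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500  # synthetic stand-in for, e.g., county-years in the Corn Belt

# Hypothetical per-unit features drawn from different spectral ranges:
# visible/near-infrared VI, thermal land surface temperature, microwave
# soil moisture, plus ancillary climate covariates.
features = {
    "vi_peak": rng.normal(0.7, 0.1, n),          # e.g., peak growing-season VI
    "lst_mean": rng.normal(300, 3, n),           # thermal-based LST (K)
    "soil_moisture": rng.normal(0.25, 0.05, n),  # microwave retrieval (m3/m3)
    "precip": rng.normal(450, 80, n),            # growing-season precip (mm)
    "tmax": rng.normal(29, 2, n),                # mean daily max temp (C)
}
X = np.column_stack(list(features.values()))

# Synthetic yield with contributions from several spectral domains.
y = (8 * features["vi_peak"] - 0.1 * (features["lst_mean"] - 300)
     + 10 * features["soil_moisture"] + 0.002 * features["precip"]
     + rng.normal(0, 0.5, n))

# One model over the fused feature set; cross-validated skill measures
# what the combined spectral ranges explain jointly.
model = RandomForestRegressor(n_estimators=200, random_state=0)
print(cross_val_score(model, X, y, cv=5, scoring="r2").round(2))
```

Dropping one feature group at a time from X and re-running the cross-validation is a simple way to gauge the marginal contribution of each spectral domain, in the spirit of the comparison the abstract describes.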

Publication Type: Journal Articles
Journal Publisher: Remote Sensing of Environment
Authors: David Lobell
Using agricultural and economic characteristics in African nations as test cases, new research by David Lobell and Marshall Burke demonstrates the use of satellite data to address the long-standing problem of accurate data collection in developing countries. An often-cited challenge in achieving development goals aimed at poverty and hunger reduction is the lack of reliable on-the-ground data. Limited or insufficient data makes it difficult to establish baseline conditions and to assess the effectiveness of various aid programs. In the past, researchers and policymakers had to rely on ground surveys, which are expensive, time-consuming, and rarely conducted. This has led to large data gaps in mapping progress toward sustainable development goals, such as in agricultural and poverty statistics.
 
This brief is based on findings from the papers “Satellite-based assessment of yield variation and its determinants in smallholder African systems,” published in Proceedings of the National Academy of Sciences in 2017 and “Combining satellite imagery and machine learning to predict poverty,” published in Science in 2016.
Publication Type: Policy Briefs

News Type: News

On March 30 and 31, 2017, Stanford held two events at SCPKU featuring the latest developments in quantitative finance and financial technology. 

On March 30, the university co-organized, with SCPKU, Tsinghua University's School of Economics and Management and Department of Mathematics, and Peking University's (PKU) Guanghua School of Management and Department of Financial Mathematics, a conference featuring new developments in quantitative finance and risk management, with a particular emphasis on trade execution, financial technology, data analysis, and insurance. This event was the third biennial conference, following previous ones at PKU in 2013 and Tsinghua in 2015. Following opening remarks by Stanford Professor of Statistics and Director of Stanford's Financial and Risk Modeling Institute (FARM) Tze Lai, experts from academia and industry, including J.P. Morgan, PKU, Tsinghua, Renmin University of China, Daokoudai, and the Southwest University of Finance and Economics in Chengdu, shared the latest developments in a wide spectrum of quantitative finance topics, ranging from conditional quasi-Monte Carlo methods to China's peer-to-peer lending market.

FARM and SCPKU also co-organized a forum on financial technology and portfolio management on March 31. Owing to advances in artificial intelligence and big data technologies, the financial industry is facing tremendous pressure to develop and implement solutions that yield improved operational efficiencies. This forum convened distinguished academic and industry speakers from quantitative trading, wealth management, asset management, financial consulting, and credit rating firms and agencies to explore the current development and future of financial technology and portfolio management.

Image: Stanford Professor of Statistics Tze Lai (center, seated) and financial forum speakers at SCPKU. (Credit: Stanford University)

It is a global story, a new industrial revolution. The spread of the internet and the proliferation of social media have led to dramatic changes with salutary results: greater access to more diverse information, gateways to goods and services that have transformed the retail experience, and opportunities to engage and network with expanded communities, while still staying in touch with friends and family, all thanks to the blessings of these new technologies.

Publication Type: Commentary
Journal Publisher: The Caravan
Authors: Russell A. Berman