
Frank Wolak discusses restructuring of the electricity industry in the U.S. using examples from California and explains the problems involved in energy market design.

Building 420, Room 40

Stanford University 
Economics Department 
579 Jane Stanford Way Stanford, CA 94305-6072 

Website: https://fawolak.org/

(650) 724-1712 / (650) 724-1717
Senior Fellow at the Freeman Spogli Institute for International Studies
Holbrook Working Professor of Commodity Price Studies in Economics
Senior Fellow, by courtesy, at the Stanford Institute for Economic Policy Research
MS, PhD

Frank A. Wolak is a Professor in the Department of Economics at Stanford University. His fields of specialization are Industrial Organization and Econometric Theory. His recent work studies methods for introducing competition into infrastructure industries -- telecommunications, electricity, water delivery and postal delivery services -- and assesses the impacts of these competition policies on consumer and producer welfare. He is the Chairman of the Market Surveillance Committee of the California Independent System Operator for the electricity supply industry in California. He is a visiting scholar at the University of California Energy Institute and a Research Associate of the National Bureau of Economic Research (NBER).

Professor Wolak received his Ph.D. and M.S. from Harvard University and his B.A. from Rice University.

Director of the Program on Energy and Sustainable Development
Speaker: Frank Wolak
Seminars
Author: Rosamond L. Naylor
Commentary
In an op-ed featured on Huffington Post, aquaculture specialist and FSE director Rosamond Naylor supports a newly proposed House bill, the National Sustainable Offshore Aquaculture Act. The bill addresses the potential threats of poorly regulated, intensive fish farming in U.S. ocean waters and would ensure that U.S. aquaculture adopts a science-based, precautionary approach to protect our ocean ecosystems, fishing communities and seafood consumers.

With all eyes on the climate deliberations in Copenhagen, it is more important than ever to find innovative ways of reducing agriculture's contribution to global climate change. The livestock industry in particular has helped feed the world but at a significant cost to the environment, including generating large emissions of greenhouse gas.

One promising solution is to substitute fish production for meat production. But to do so we must ensure that the "blue revolution" in ocean fish farming does not lead to the same suite of environmental problems that have accompanied the "green revolution" for land-based agriculture. Americans' appetite for fish continues to grow and is increasingly met by a year-round supply of fresh fish imported into our marketplace. Yet few Americans know where their fish comes from or how it was produced. Just as most chickens, pigs and cows are raised in tightly confined, intensive operations, so too are many fish.

Right now in the United States we have an opportunity to help ensure that the emerging marine aquaculture sector meets both human and environmental needs. This week, Rep. Lois Capps (D-Calif.) will introduce in the House of Representatives a bill called the National Sustainable Offshore Aquaculture Act that addresses the potential threats of poorly regulated fish farming in U.S. ocean waters. These threats include spread of disease and parasites from farmed to wild fish; discharge of effluents into surrounding waters; misuse of antibiotics and other pharmaceuticals and chemicals; escape of farmed fish into wild fish habitat; killing of marine mammals and sharks that might prey on ocean farm cages; and reliance on use of wild-caught fish in aquaculture feeds, which could deplete food supplies for other marine life and the aquaculture industry itself over time.

These environmental impacts have been evident in many other countries with intensive marine fish farming. The recent collapse of salmon aquaculture in Chile, where industry expansion was prioritized over environmental protection, is the most glaring example. Salmon, one of Chile's leading exports, has suffered a major blow as a result of poor regulation and environmentally unsound management. Tens of thousands of people are now jobless in southern Chile, where the salmon farming industry once boomed.

There are three critical points to be made about the Capps bill. First, unlike previous attempts to legislate on fish farming at the national level, the bill would ensure that U.S. aquaculture adopts a science-based, precautionary approach that establishes a priority for the protection of wild fish and functional ecosystems. This approach is consistent with President Obama's recent call to develop a comprehensive and integrated plan to manage our oceans' many competing uses to ensure protection of vital ecosystem services in years to come.

Second, the Capps bill would preempt the emergence of ecologically risky, piecemeal regulation of ocean fish farming in different regions of the U.S. Efforts are already afoot in Hawaii, California, the Gulf of Mexico and New England to expand marine aquaculture without consistent standards to govern their environmental or social performance. If these piecemeal regional initiatives move forward, there will be little hope of creating a sustainable national policy for U.S. open-ocean aquaculture.

Finally, the Capps bill as currently written has a solid, long-term vision for the appropriate role of fish farming in sustainable ocean ecosystems and thus should win widespread support among environmental and fishing constituencies. It should also garner support from the more progressive end of the aquaculture industry that aspires to sustainable domestic fish production.

Previous federal bills introduced in 2005 and 2007 were fundamentally flawed -- and thus rightly criticized -- because they put the goal of aquaculture expansion far above that of environmental protection. Now, for the first time, a bill has been introduced that would demonstrably protect our ocean ecosystems, fishing communities and seafood consumers from the risks of poorly regulated open-ocean aquaculture.

Rep. Capps and her colleagues are to be commended. Now is the time for the new leadership in Washington -- at the White House and at the National Oceanic and Atmospheric Administration -- to embrace this more science-based and precautionary approach to ensure a sustainable future for U.S. ocean aquaculture.


Since the 2001 anthrax attacks, members of the biosecurity community and US government officials have expressed a growing sense of alarm at the threat of a biological attack.  The Commission on the Prevention of Weapons of Mass Destruction Proliferation and Terrorism recently predicted that a terrorist attack involving WMD is likely to take place by 2013 and identified biological terrorism as the most likely contingency.  To counter this threat, increasing emphasis has been placed on the role of microbial forensics in deterring an attack. New infrastructure has been established by the US government to develop capabilities to identify the source of a pathogen used in an attack and identify the perpetrators. However, many open questions remain about the potential efficacy of this approach both from a technological capabilities standpoint and from a deterrence perspective.

Existing technologies can be borrowed from molecular biology to identify elements in a pathogen's DNA, which could help investigators trace it back to a specific source strain. However, these tools are limited, and new methods should be developed to increase confidence in microbial forensics analyses. Moreover, a comprehensive genome database of pathogen strains is necessary for an effective investigation in the event of an attack. Who will cover the costs of sequencing pathogen genome strains to generate such a database? Will there be obstacles to gaining cooperation from academic and government facilities within the United States and internationally?  In the best-case scenario, advances in microbial forensics could enable us to identify the source of a biological attack; would these capabilities effectively deter non-state actors? These questions must be addressed to determine the extent to which microbial forensics programs can meet their stated goals.

Jaime Yassif is a doctoral candidate in the Biophysics Group at UC Berkeley. She is conducting her thesis research in the Liphardt lab, where she studies the dynamics of RNA-binding proteins using a single-molecule technique called plasmon rulers.

Prior to her graduate work, Ms. Yassif worked for several years in science and security policy and arms control. She began as a research assistant at the Federation of American Scientists, where she contributed to the writing of Senate Foreign Relations Committee testimony on radiological weapons and authored a piece on radiological decontamination in Defense News. She then worked as a program officer at the Nuclear Threat Initiative, where she provided support for the organization's four key program areas (Russia/New Independent States, Biological, Regional and Communications) and managed the organization of an international workshop on Global Best Practices in Nuclear Materials Management. This was followed by a fellowship to study the Chinese nuclear posture at Tsinghua University in Beijing.

Ms. Yassif holds an MA in Science and Security from the War Studies Department at King's College London, where she wrote her thesis on verification of the Biological Weapons Convention.  She received her bachelor's degree in Biology from Swarthmore College. Ms. Yassif is former president of the student-run Science, Technology and Engineering Policy group at UC Berkeley and a member of Women in International Security.

Martha Crenshaw is a senior fellow at the Center for International Security and Cooperation and the Freeman Spogli Institute, and professor of political science (by courtesy). Her current research focuses on why the United States is the target of terrorism, the effectiveness of counterterrorism policies, and mapping terrorist organizations. Professor Crenshaw served on the Executive Board of Women in International Security and chaired the American Political Science Association (APSA) Task Force on Political Violence and Terrorism. She was a Guggenheim Fellow in 2005-2006. Her edited book, The Consequences of Counterterrorism in Democracies, is being published by the Russell Sage Foundation.

Reuben W. Hills Conference Room

Speaker: Jaime Yassif, PhD candidate, UC Berkeley Biophysics Graduate Group
Commentator: Martha Crenshaw, Professor of Political Science (by courtesy) and Senior Fellow at CISAC and FSI
Seminars

Nonproliferation efforts have traditionally focused on controlling supply of proliferation-relevant technology, expertise, and material. As barriers to diffusion of all three have been lowered, there is increased acknowledgement of the need to reduce demand for such weapons, and, in cases where efforts to prevent proliferation have failed, the need to develop effective international responses. However, with few exceptions, approaches to nonproliferation have not changed qualitatively in the last 40 years. This research explores the concept of resilience as understood for other complex interactive systems, extracts key features, and applies them to nonproliferation. In addition, it examines unintended consequences of traditional nonproliferation strategies and feedbacks among them.  Based on insights gained from this exercise, a new analytical framework for nonproliferation will be proposed.

Arian L. Pregenzer is a 2009-2010 CISAC visiting scholar and a Senior Scientist in the Global Security Program at Sandia National Laboratories in Albuquerque, New Mexico. She is responsible for initiating new programs in arms control and nonproliferation and for developing strategies for international engagement for multiple laboratory programs. In addition, she provides leadership for Sandia's efforts to integrate across nuclear weapons, arms control, and nonproliferation missions to effectively meet nuclear security challenges.

Most recently, Dr. Pregenzer has focused on near-term steps that can enhance nuclear security while advancing the goals of NPT Article VI. She is particularly interested in how international technical cooperation on topics such as verification methods for nuclear arms control, nuclear weapons security and accountability, and nuclear fuel cycle management can establish the technical basis for moving toward a world without nuclear weapons.

Dr. Pregenzer has bachelor's degrees in physics, mathematics, and philosophy from the University of New Mexico, and a Ph.D. in theoretical condensed matter physics from the University of California at San Diego. Prior to her career in international security, she worked at Sandia to develop lithium ion sources for particle-beam-driven inertial confinement fusion.

Reuben W. Hills Conference Room

Speaker: Arian Pregenzer, CISAC Visiting Scholar
Seminars
Author: Jeremy Carl
Commentary
Jeremy Carl argues that despite India’s lack of a concrete binding target for significant CO2 emissions reductions, India’s climate commitments come through on other fronts.

Sometimes in diplomacy what is not announced is more revealing than what is. Such is certainly the case in India's recent climate and energy negotiations with the US, as both countries prepare to head to global climate talks in Copenhagen. The occasion of Manmohan Singh's state visit to the US brought the announcement of a flurry of energy and climate-related initiatives. These initiatives were a combination of substance and political theatre, with potentially important initiatives on environmental and regulatory capacity-building and technology partnerships buried under a deep layer of bureaucratic niceties.

What was more noticed was what was not announced: any agreement for India to adopt a binding target for CO2 emissions reductions, something US and European environmentalists have long claimed is necessary as part of a global effort to stave off severe climate change. And while the Indian government eventually announced a targeted reduction in what is known as "emissions intensity" -- CO2 emissions per unit of GDP -- that wasn't a big stretch, given India's current annual efficiency improvements. Furthermore, Minister for Environment and Forests Jairam Ramesh has made it abundantly clear in Parliament that such targets would be voluntary and not part of a binding international agreement.

With more than 60 world leaders in attendance, we can be assured that Copenhagen will not end in public failure. But the better question is whether the announced success in Copenhagen will have any practical meaning other than determining that diplomats can spin a "success" out of any actual events. Some Indian commentators have seemed to hope for a "success" of that sort - fretting about India being outmanoeuvred on the public stage by China and other developing countries that may be able to strike a more cooperative posture.

While from a tactical standpoint, such concerns are understandable (there is little reason for India to not commit to doing things it would like to do anyway, such as developing more efficient power plants or cars), from the perspective of actually taking leadership in addressing the climate problem, they mean little. In some ways, India is emulating the example of the US from the previous Kyoto climate round: while the US certainly should have been more proactive and engaged, at least the Americans had the integrity not to ratify an agreement that they couldn't keep. Many other nations could not claim that; they either missed their targets entirely, or resorted to bogus accounting tricks to meet their goals.

That India is showing its seriousness by not making climate commitments it won't live by should actually be seen as a mature and responsible decision, not an intransigent one. Does anyone think that China won't walk away from its promises if it has trouble meeting its emissions reduction goals?

As an alternative to the hot air that is likely to come out of Copenhagen, it is instructive to look at the potentially useful energy and climate agreements the US and India did sign during the PM's recent visit. The fact that clean energy was the second item listed behind security issues in the joint communiqué announced by Singh and Obama is clear evidence that both India and the US place a high importance on this aspect of their relationship.

India and the US announced numerous programmes, from the joint deployment of solar electricity in Indian cities to the strengthening of India's environmental regulatory and monitoring capacity - which is sure to be a critical step if India is to make serious and verifiable long-term commitments to emissions reductions. Perhaps most important, at least symbolically, was the announcement of joint scientific R&D work for renewable energy technologies. The Indo-US Clean Energy Research and Deployment Initiative, which promises joint development of new energy technologies and the development of a joint research centre with a public-private funding model, is one such initiative.

Ultimately, despite the bluster of diplomats in Delhi, Washington or Copenhagen, the solutions to the climate change problem must come through a technological revolution in the world's energy infrastructure. And it is here that India, with its burgeoning corps of bright young engineers, could make the biggest impact on climate change mitigation. Circumstances may not permit India to lead the deal-making in Denmark, but if the Indian government gets serious about turning more of India's brightest young minds towards solving the clean energy problem, then India's contribution to solving the climate change conundrum may be significant indeed.


CDDRL
Stanford University
Encina Hall
616 Serra Street
Stanford, CA 94305-6055

Visiting Scholar 2010
PhD

Philippe C. Schmitter is a visiting scholar at CDDRL during winter quarter 2010. Beginning in 1967, he was successively assistant professor, associate professor and professor in the Politics Department of the University of Chicago, then a professor at the European University Institute (1982-86) and at Stanford (1986-96). He was Professor of Political Science in the Department of Political and Social Sciences at the European University Institute in Florence until September 2004, and he is now professor emeritus of that department.

He has been a visiting professor at the Universities of Paris-I, Geneva, Mannheim and Zürich, and a Fellow of the Humboldt Foundation, the Guggenheim Foundation and the Center for Advanced Study in the Behavioral Sciences in Palo Alto.

He has published books and articles on comparative politics, on regional integration in Western Europe and Latin America, on the transition from authoritarian rule in Southern Europe and Latin America, and on the intermediation of class, sectoral and professional interests.

His current work is on the political characteristics of the emerging Euro-polity, on the consolidation of democracy in Southern and Eastern countries, and on the possibility of post-liberal democracy in Western Europe and North America.
Recently, Professor Schmitter was awarded the Johan Skytte Prize in Political Science (2009).

He earned his PhD from the University of California at Berkeley.


Who should decide how users can use the Internet: users or network providers? Should network providers be allowed to block certain applications or content on their networks? Should they be allowed to offer different classes of service to applications or content, and, if so, whom should they be allowed to charge for this service? And should the answers to these questions differ depending on whether a network provider engages in these practices to manage bandwidth on its network?

Triggered by changes in Internet technology, these questions over network neutrality have moved to the center of the regulatory and legislative debates surrounding the Internet worldwide. They are at the core of the Open Internet Proceeding, launched by the Federal Communications Commission in October 2009 to explore what rules are needed to secure the Internet's openness. The talk will give an overview of the draft rules proposed by the Federal Communications Commission and explain how the alternative options under consideration would affect the environment for political speech in the United States.

Barbara van Schewick's research focuses on the economic, regulatory, and strategic implications of communication networks. In particular, she explores how changes in the architecture of computer networks affect the economic environment for innovation and competition on the Internet, and how the law should react to these changes. This work has made her a leading expert on the issue of network neutrality. Her book "Internet Architecture and Innovation" will be published by MIT Press this spring.

Professor van Schewick is the Faculty Director of Stanford Law School's Center for Internet and Society and an assistant professor (by courtesy) in Stanford's Department of Electrical Engineering.

Prior to joining the Stanford Law faculty, van Schewick was a senior researcher at the Technical University Berlin, Germany, and a nonresidential fellow of the Center for Internet and Society. Van Schewick has advised the German Federal Ministry of Education and Research on innovation and technology policy and worked with the German Federal Network Agency on spectrum policy. From August 2000 to November 2001, she was the first residential fellow at the Center for Internet and Society.

Summary of the Seminar
Barbara van Schewick, Assistant Professor at the Stanford Law School, introduced the current debate about net neutrality and explored the implications for diversity and freedom of expression online.

Network providers were at one time 'application blind' -- they were unable to see what was contained in the data packets that allow information to be transmitted online. Now that this is no longer the case, a debate has emerged about the role of regulation in controlling the ability of network providers to block or interfere with applications. What was drawn up as a voluntary policy statement is now being considered and revised in the FCC's Open Internet Proceeding.

Blocking of applications is problematic on several counts. First, there may be incentives for network providers to block applications that threaten their own profitability (for example, Skype). This leads to a situation where the success of applications is no longer decided on user criteria and the overall value created for society diminishes. Second, the great promise of the internet is that it removes traditional gatekeepers (such as mass media outlets) to speech. This is undermined if network providers have the ability to control what content users see. This is particularly problematic since users cannot easily switch to another provider as they could if a particular store did not carry a product they wanted. The cost of switching makes this impractical and in places without a choice of providers, this is not an option.

In drawing up regulation against blocking the FCC is debating a number of related issues:

Discrimination: Even if blocking is prohibited, discriminating between levels of service can still allow network providers to slow down an application to the extent that it becomes unusable. This is actually a more effective tool than blocking since it is much harder to detect. Users may attribute slow speeds to poor design, and potentially useful applications will fail to get traction.

Charges for different levels of service: Even if we agree network providers should not discriminate between the services they provide in an arbitrary way, could they offer improved service for payment? Opponents argue that this policy would be bad for competition since new developers would be unable to pay for the levels of service that established players could afford. And it would threaten the ability of poorly resourced minority voices - e.g. small NGOs and publications - to get heard.

Exceptions to discrimination: Network providers argue that some discrimination is needed to allow them to undertake reasonable network management. But it is difficult to determine what counts as reasonable management. One concern is that peer-to-peer networks -- which allow those without many resources to exchange material cheaply -- might be targeted in particular, since they can create a lot of congestion. This might also threaten the ability of new bandwidth-intensive applications to get funding, since investors would perceive the risk of being slowed down by the networks to be too high.

Many of the major benefits of the internet - the ease of publishing and coordinating, for example - are only possible through applications. Hence the outcome of this debate will have serious implications for the future social and political impact of the internet. 

Wallenberg Theater

Speaker: Barbara van Schewick, Assistant Professor of Law, Stanford Law School
Seminars

Abstract
One of the biggest themes of the 21st century is interconnection -- specifically, the interconnection of people and data.  These interconnections can change everything about how we see the world, how the world sees us, and how we work together.  Where some people might see "big brother," I see empowerment -- empowerment of groups and individuals to improve quality of life and reduce our impact on the planet. 

Megan Smith oversees teams that manage early-stage partnerships, explorations and technology licensing. She also leads the Google.org team, guiding strategy and developing new partnerships and internal projects with Google's engineering and product teams. She joined Google in 2003 and has led several of the company's acquisitions, including Keyhole (Google Earth), Where2Tech (Google Maps), and Picasa. She also co-led the company's early work with publishers for Google Book Search. Previously, Megan was the CEO and, earlier, COO of PlanetOut, the leading gay, lesbian, bisexual and transgender online community. Under her leadership, PlanetOut grew tenfold in reach and revenue. Prior to that, Megan was at General Magic for six years working on handheld communications products and partnerships. She also worked in multimedia at Apple Japan in Tokyo.

Over the years, Megan has contributed to a wide range of engineering projects, such as designing an award-winning bicycle lock; working on a space station construction research project that eventually flew on the U.S. space shuttle; and running a field-research study on solar cookstoves in South America. She was also a member of the MIT-Solectria student team that designed, built, and raced a solar car in the first cross-continental solar car race, covering 2000 miles of the Australian outback. She was selected as one of the 100 World Economic Forum technology pioneers for 2001 and 2002.

Megan holds a bachelor's degree and a master's degree in mechanical engineering from MIT, where she now serves on the board. She completed her master's thesis work at the MIT Media Lab.

Summary of the Seminar
Megan Smith, Vice President, New Business Development and General Manager, Google.org, argued that the greater interconnectedness achieved by information technology is a major liberating force in the world. Whether it is aiding the coordination of protests or increasing the transparency of governments, the exchange of information has huge benefits. This is not a new phenomenon. In places where people have been able to exchange information easily, social progress has followed. Megan cited the example of Seneca Falls, New York, where the canal system allowed for extensive communication; the town became significant in both the women's rights and abolition movements.

While a large proportion of the world is benefiting from greater interconnectedness, Africa still lacks the infrastructure to take full advantage. Submarine fiber optic cables are necessary for quick and cheap internet access, and many African countries, particularly in the east, are not connected to these, relying instead on satellites. This is likely to change over the next few years, bringing great potential for further development.

The mission of Google.org is to use technology to drive solutions to global challenges such as climate change, pandemic disease and poverty. The organization was set up as part of a commitment to devote approximately one percent of Google's equity plus one percent of annual profits to philanthropy, along with employee time.  Google.org now places its strategic focus on those projects that can leverage the resources of Google staff, particularly its engineers.

Current projects that harness the power of information include:

  • Google Flu Trends: This uses aggregated Google search data to estimate flu activity up to two weeks earlier than traditional methods. This system has almost 90% accuracy in real time flu prediction and is therefore an extremely useful tool for health delivery agencies. It is now being used in 30 countries. Google is also starting to work in Cambodia to collect data around SARS.
  • Google PowerMeter provides a system for consumers to understand their in-home energy use and to take steps to reduce it. The Meter receives information from utility smart meters and in-home energy management devices and visualizes this information on iGoogle (a personalized Google homepage). The premise underlying this project is that greater information will be crucial to tackling climate change, and consumers should be empowered to make informed decisions about their energy use.
  • Disaster relief: In response to the Haitian earthquake, a team of engineers worked with the U.S. Department of State to create an online People Finder gadget so that people can submit information about missing persons and to search the database. Google Earth satellite images have also been used to document the extent of damage.

Wallenberg Theater

Speaker: Megan Smith, Vice President, New Business Development, and General Manager, Google.org
Seminars

Some 700,000 Koreans, 40,000 Chinese and 35,000 Allied POWs performed forced labor for private companies within Japan during the Asia Pacific War. Kyushu coal mines were a wartime center of this injustice and Fukuoka is a major locus of ongoing redress efforts, which the presenter has closely observed. A front-row account of the interaction between community activists in Japan, Korea, China and North America will be provided and key results will be discussed. The Japanese government has been prodded into sending the remains of Korean labor conscripts to South Korea and handing over the long-suppressed records that Seoul needs to fully implement its own compensation program. Lawsuits in Japanese courts stemming from forced labor by Chinese proved partially successful, raising expectations that more Japanese firms may voluntarily settle the especially strong Chinese claims. Amid the controversy surrounding former Prime Minister Aso's admission that there were POWs at Aso Mining, Japan issued new official apologies and is expanding a POW reconciliation program. Fluid networks of independent researchers and Internet-empowered activists continue to influence developments within Japan's changing political landscape. This transnational grassroots activism also faces barriers and limitations.

Mr. Underwood's doctoral research at Kyushu University analyzed the reparations movement for Chinese forced labor in Japan during World War Two, locating it within the global trend toward repairing historical injustices. His articles for The Asia-Pacific Journal: Japan Focus (www.japanfocus.org) provide the fullest descriptions of forced labor redress activities involving Chinese as well as Korean victims. He played a key role in forcing former Japanese Prime Minister Aso Taro to admit there were Allied POWs at Aso Mining during the war. His Web site is www.williamunderwood.org.

Philippines Conference Room

William Underwood Speaker Independent Researcher
Seminars

This lecture summarizes the argument of a forthcoming book (Suhrkamp, Princeton University Press) that Stalin's crimes of the 1930s should be considered genocide. This requires a review of historical/legal concepts of genocide and of the mass killing of the period itself.

Norman Naimark is the Robert and Florence McDonnell Professor of East European Studies, a professor of history, a core faculty member of FSI's Forum on Contemporary Europe, and an FSI senior fellow by courtesy. He is an expert on modern East European, Balkan, and Russian history. His current research focuses on the history of genocide in the 20th century and on postwar Soviet policy in Europe. He is the author of the critically acclaimed volumes The Russians in Germany: The History of the Soviet Zone of Occupation, 1945-1949 (Harvard 1995) and Fires of Hatred: Ethnic Cleansing in 20th Century Europe (Harvard 2001). Most recently, he has co-edited books on Yugoslavia and its Historians (Stanford 2003), Soviet Politics in Austria, 1945-1955: Documents from the Russian Archives (in German and Russian, Austrian Academy of Sciences, 2006), and The Lost Transcripts of the Politburo (Yale 2008).

Naimark is a senior fellow by courtesy of the Hoover Institution and Burke Family Director of the Bing Overseas Studies Program at Stanford. He also served as chair of Stanford's Department of History and of its programs in International Relations and International Policy Studies. He has served on the editorial boards of a series of leading professional journals, including The American Historical Review, The Journal of Modern History, Slavic Review, and East European Politics and Societies. He served as President of the American Association for the Advancement of Slavic Studies (1997) and as chairman of the Joint Committee on Eastern Europe of the American Council of Learned Societies and Social Science Research Council (1992-1997).

Before joining the Stanford faculty, Naimark was a professor of history at Boston University and a fellow of the Russian Research Center at Harvard. He also held the visiting Catherine Wasserman Davis Chair of Slavic Studies at Wellesley College. He has been awarded the Officer's Cross of the Order of Merit of the Federal Republic of Germany (1996), the Richard W. Lyman Award for outstanding faculty volunteer service (1995), and the Dean's Teaching Award from Stanford University for 1991-92 and 2002-03.

This event marks the Stanford inauguration of the series developed with the Forum on Contemporary Europe at FSI, in partnership with Suhrkamp Verlag.

The series is also supported by the Division of Humanities and Sciences, the Stanford Humanities Center, the Department of Literatures, Cultures, and Languages, and the German Stanford Club.


Levinthal Hall

CISAC
Stanford University
Encina Hall, C235
Stanford, CA 94305-6165

(650) 723-6927 (650) 725-0597
Senior Fellow, by courtesy, at the Freeman Spogli Institute for International Studies
Robert & Florence McDonnell Professor of East European Studies
Professor of History
Professor, by courtesy, of German Studies
Senior Fellow at the Hoover Institution
MS, PhD

Norman M. Naimark is the Robert and Florence McDonnell Professor of East European Studies, a Professor of History and (by courtesy) of German Studies, and a Senior Fellow of the Hoover Institution and (by courtesy) of the Freeman Spogli Institute for International Studies. Norman formerly served as the Sakurako and William Fisher Family Director of the Stanford Global Studies Division, the Burke Family Director of the Bing Overseas Studies Program, the Convener of the European Forum (predecessor to The Europe Center), Chair of the History Department, and the Director of Stanford's Center for Russian, East European, and Eurasian Studies.

Norman earned his Ph.D. in History from Stanford University in 1972, and before returning to join the faculty in 1988 he was a professor of history at Boston University and a fellow of the Russian Research Center at Harvard. He also held the visiting Catherine Wasserman Davis Chair of Slavic Studies at Wellesley College. He has been awarded the Officer's Cross of the Order of Merit of the Federal Republic of Germany (1996), the Richard W. Lyman Award for outstanding faculty volunteer service (1995), and the Dean's Teaching Award from Stanford University for 1991-92 and 2002-03.

Norman is interested in modern Eastern European and Russian history, and his research focuses on Soviet policies and actions in Europe after World War II and on genocide and ethnic cleansing in the twentieth century. His published monographs on these topics include The History of the "Proletariat": The Emergence of Marxism in the Kingdom of Poland, 1870–1887 (1979, Columbia University Press), Terrorists and Social Democrats: The Russian Revolutionary Movement under Alexander III (1983, Harvard University Press), The Russians in Germany: The History of the Soviet Zone of Occupation, 1945–1949 (1995, Harvard University Press), The Establishment of Communist Regimes in Eastern Europe (1998, Westview Press), Fires of Hatred: Ethnic Cleansing in 20th Century Europe (2001, Harvard University Press), Stalin's Genocides (2010, Princeton University Press), and Genocide: A World History (2016, Oxford University Press). Naimark's latest book, Stalin and the Fate of Europe: The Postwar Struggle for Sovereignty (Harvard 2019), explores seven case studies that illuminate Soviet policy in Europe and European attempts to build new, independent countries after World War II.

 

Affiliated faculty at The Europe Center
Affiliated faculty at the Center on Democracy, Development and the Rule of Law
Norman M. Naimark, Robert and Florence McDonnell Professor of East European Studies, Department of History; by courtesy: Senior Fellow, Freeman Spogli Institute; Senior Fellow, Hoover Institution; Professor, German Studies. Speaker
Lectures