Information Technology

The Computer Fraud and Abuse Act (CFAA) provides a civil cause of action for computer hacking victims who have suffered certain types of harm. Of these harms, the one most commonly invoked by plaintiffs is having suffered $5,000 or more of cognizable “loss” as defined by the statute. In its first-ever CFAA case, 2021’s Van Buren v. United States, the Supreme Court included intriguing language suggesting that “loss” in civil cases should be limited to “technological harms” constituting “the typical consequences of hacking.” To date, lower courts have followed the Court’s interpretation only if their circuit already interpreted “loss” narrowly pre-Van Buren and have otherwise continued to approach “loss” broadly.

Van Buren did not fully dissipate the legal risks the CFAA has long posed to a particular community: people who engage in good-faith cybersecurity research. Discovering and reporting security vulnerabilities in software and hardware risks legal action from vendors displeased with unflattering revelations about their products’ flaws. Research activities have even led to criminal investigations at times. Although Van Buren narrowed the CFAA’s scope and prompted reforms in federal criminal charging policy, researchers continue to face some legal exposure. The CFAA still lets litigious vendors “shoot the messenger” by suing over security research that did them no harm. Spending just $5,000 addressing a vulnerability is sufficient to allow the vendor to sue the researcher who reported it, because such remediation costs qualify as “loss” even in courts that read that term narrowly.

To mitigate the CFAA’s legal risk to researchers, a common proposal is a statutory safe harbor for security research. Such proposals walk a fine line between being unduly byzantine for good-faith actors to follow and being lax enough to invite abuse by malicious actors. Instead of the safe harbor approach, this article recommends a simpler way to reduce litigation over harmless research: follow the money.

The Article proposes (1) amending the CFAA’s “loss” definition to prevent vulnerability remediation costs alone from satisfying the $5,000 standing threshold absent any other alleged loss, and (2) adding a fee-shifting provision that can be invoked where plaintiffs’ losses do not meet that threshold. Tightening up the “loss” calculus would disqualify retaliatory litigation against beneficial (or at least benign) security research while preserving victims’ ability to seek redress where well-intended research activities do cause harm. Fee-shifting would deter weak CFAA claims and give the recipients of legal threats some leverage to fight back. Coupled with the Van Buren decision, these changes would reach beyond the context of vendor versus researcher: they would help rein in the CFAA’s rampant misuse over behavior far afield from the law’s core anti-hacking purpose.

Publication Type: Journal Articles
Journal Publisher: Richmond Journal of Law & Technology
Authors: Riana Pfefferkorn
Number: 1

Come join The Journal of Online Trust & Safety, an open-access journal for cutting-edge trust and safety scholarship, as we bring together authors published in our special issue, Uncommon yet Consequential Online Harms, for a webinar on September 1, 9:30-10:30 a.m. PT.

The Journal of Online Trust & Safety publishes research from computer science, sociology, political science, law, and more. Journal articles have been covered in The Guardian, The Washington Post, and Platformer and cited in Senate testimony and a platform policy announcement.

Articles in this special issue will include: 

Election Fraud, YouTube, and Public Perception of the Legitimacy of President Biden by James Bisbee, Megan A. Brown, Angela Lai, Richard Bonneau, Joshua A. Tucker, and Jonathan Nagler

Predictors of Radical Intentions among Incels: A Survey of 54 Self-identified Incels by Sophia Moskalenko, Naama Kates, Juncal Fernández-Garayzábal González, and Mia Bloom

Procedural Justice and Self Governance on Twitter: Unpacking the Experience of Rule Breaking on Twitter by Matthew Katsaros, Tom Tyler, Jisu Kim, and Tracey Meares

Twitter’s Disputed Tags May Be Ineffective at Reducing Belief in Fake News and Only Reduce Intentions to Share Fake News Among Democrats and Independents by Jeffrey Lees, Abigail McCarter, and Dawn M. Sarno

To hear from the authors about their new research, please register for the webinar. To be notified about journal updates, please sign up for Stanford Internet Observatory announcements and follow @journalsafetech. Questions about the journal can be sent to trustandsafetyjournal@stanford.edu.

 

 

Panel Discussions
CDDRL/HAI Predoctoral Scholar, 2022-2023

Eddie Yang is a PhD candidate in the Department of Political Science at UC San Diego. His research focuses on repression and the politics of Artificial Intelligence. His dissertation studies how existing repressive institutions limit the usefulness of AI for authoritarian control, with a focus on China. His work has been published in both computer science and political science venues.


Join us September 29-30 for two days of cross-professional presentations and conversations designed to push forward research on trust and safety.

Hosted at Stanford University’s Frances C. Arrillaga Alumni Center, the Trust and Safety Research Conference will convene trust and safety practitioners, people in government and civil society, and academics in fields like computer science, sociology, law, and political science to think deeply about trust and safety issues.

Your ticket gives you access to:

  • Two days of talks, panels, workshops, and breakouts
  • Networking opportunities, including happy hours on September 28, 29, and 30
  • Breakfast and lunch on September 29 and 30

Early bird tickets are $100 for attendees from academia and civil society and $500 for attendees from industry. Ticket prices go up August 1, 2022. Full refunds or substitutions will be honored until August 15, 2022. After August 15, 2022, no refunds will be allowed.

For questions, please contact us at internetobservatory@stanford.edu.

Frances C. Arrillaga Alumni Center
326 Galvez Street
Stanford, CA 94305

Conferences

The event will be webcast live from this page.

The Stanford Center on China’s Economy and Institutions (SCCEI) and the CSIS Trustee Chair in Chinese Business and Economics launch the third feature of their new collaboration, Big Data China, on July 27 at 9 a.m. PDT / 12 p.m. EDT. The feature, “The AI-Surveillance Symbiosis in China,” highlights the work of professors Noam Yuchtman (London School of Economics) and David Yang (Harvard University) and their colleagues. The feature shows how China’s large-scale investments in surveillance technology are both enhancing the state’s capacity to repress dissent and providing commercial advantages to Chinese AI companies operating in the facial recognition and surveillance space.


The event will feature a presentation of the key findings of the analysis and its implications for the Washington policy community by Noam Yuchtman of the London School of Economics, David Yang of Harvard University, and Trustee Chair Fellow Ilaria Mazzocco. Trustee Chair Director Scott Kennedy will moderate a panel discussion that will include questions from the audience. The distinguished panelists for the event are Emily Weinstein of CSET, Paul Mozur of the New York Times, and Trustee Chair non-resident senior associate Paul Triolo.


WATCH THE RECORDING

FEATURING

Noam Yuchtman 
Professor of Managerial Economics and Strategy, London School of Economics and Political Science
David Yang 
Assistant Professor of Economics, Harvard University
Emily Weinstein  
Research Fellow, Center for Security and Emerging Technology (CSET), Georgetown University
Paul Mozur 
Correspondent, New York Times
Scott Kennedy 
Senior Adviser and Trustee Chair in Chinese Business and Economics
Ilaria Mazzocco 
Fellow, Trustee Chair in Chinese Business and Economics
Paul Triolo 
Senior Associate (Non-resident), Trustee Chair in Chinese Business and Economics
 

EVENT PARTNERS
 

SCCEI and CSIS

Virtual Livestream 

Panel Discussions

South Korea's Democracy in Crisis: The Threats of Illiberalism, Populism, and Polarization 
위기의 한국 민주주의: 비자유주의, 포퓰리즘, 양극화의 위협

In this book launch event held in Korea, the participants will examine and discuss the threats to democracy in Korea. For more information about the book, please visit the publication webpage.

(The book launch event will be in Korean.)

14:00-14:05 Introduction by Ho-Ki Kim, Professor of Sociology, Yonsei University

Moderated by Dukjin Chang, Professor of Sociology, Seoul National University

14:05-15:20 Presentations

Democracy in Crisis: Populism in Post-Truth Era
Gi-Wook Shin, Director of Shorenstein Asia-Pacific Research Center, Stanford University
Ho-Ki Kim, Professor of Sociology, Yonsei University

Two Divergences in South Korea’s Economy and Disparities in Democracy
Jun-Ho Jeong, Professor of Economics, Kangwon University
Il-Young Lee, Professor of Economics, Hanshin University

Judicialization of Politics and Politicization of the Judiciary in Korea: Challenges in Maintaining the Balance of Power
Seongwook Heo, Professor of Law, Seoul National University

15:20-15:40 Break

15:40-16:55 Panel Discussion

Won-Taek Kang, Professor of Political Science, Seoul National University
Seeun Jeong, Professor of Economics, Chungnam National University
Chulwoo Lee, Professor of Law, Yonsei University

16:55-17:00 Closing Remarks by Gi-Wook Shin, Director of Shorenstein Asia-Pacific Research Center, Stanford University

This event is made possible by generous support from the Korea Foundation and other friends of the Korea Program.

In-Person event in Korea
June 14, 2PM-5PM, Korea Time
Press Center, Seoul

Seminars
Authors: Melissa De Witte, Taylor Kubota, Ker Than
News Type: News

During a speech at Stanford University on Thursday, April 21, 2022, former U.S. President Barack Obama presented his audience with a stark choice: “Do we allow our democracy to wither, or do we make it better?”

Over the course of an hour-long address, Obama outlined the threat that disinformation online, including deepfake technology powered by AI, poses to democracy as well as ways he thought the problems might be addressed in the United States and abroad.

“This is an opportunity, it’s a chance that we should welcome for governments to take on a big important problem and prove that democracy and innovation can coexist,” Obama said.

Obama, who served as the 44th president of the United States from 2009 to 2017, was the keynote speaker at a one-day symposium, titled “Challenges to Democracy in the Digital Information Realm,” co-hosted by the Stanford Cyber Policy Center and the Obama Foundation on the Stanford campus on April 21.

The event brought together people working in technology, policy, and academia for panel discussions on topics ranging from the role of government in establishing online trust, the relationship between democracy and tech companies, and the threat of digital authoritarians.

Obama told a packed audience of more than 600 people in CEMEX auditorium – as well as more than 250,000 viewers tuning in online – that everyone is part of the solution to make democracy stronger in the digital age and that all of us – from technology companies and their employees to students and ordinary citizens – must work together to adapt old institutions and values to a new era of information. “If we do nothing, I’m convinced the trends that we’re seeing will get worse,” he said.

Introducing the former president were Michael McFaul, director of the Freeman Spogli Institute for International Studies and U.S. ambassador to Russia under Obama, and Tiana Epps-Johnson, BA ’08, a Stanford alum and Obama Foundation fellow.

Epps-Johnson, who is the founder and executive director of the Center for Tech and Civic Life, recalled her time answering calls to an election protection hotline during the 2006 midterm election. She said the experience taught her an important lesson, which was that “the overall health of our democracy, whether we have a voting process that is fair and trustworthy, is more important than any one election outcome.”

Stanford freshman Evan Jackson said afterward that Obama’s speech resonated with him. “I use social media a lot, every day, and I’m always seeing all the fake news that can be spread easily. And I do understand that when you have controversy attached to what you’re saying, it can reach larger crowds,” Jackson said. “So if we do find a way to better contain the controversy and the fake news, it can definitely help our democracy stay powerful for our nation.”

The Promise and Perils Technology Poses to Democracy


In his keynote, Obama reflected on how technology has transformed the way people create and consume media. Digital and social media companies have upended traditional media – from local newspapers to broadcast television – as well as the role these outlets played in society at large.

During the 1960s and 1970s, the American public tuned in to one of three major networks, and while media from those earlier eras had their own set of problems – such as excluding women and people of color – they did provide people with a shared culture, Obama said.

Moreover, these media institutions, with established journalistic best practices for accuracy and accountability, also provided people with similar information: “When it came to the news, at least, citizens across the political spectrum tended to operate using a shared set of facts – what they saw or what they heard from Walter Cronkite or David Brinkley.”

Fast forward to today, when everyone has access to individualized news feeds driven by algorithms that reward the loudest and angriest voices (and from which technology companies profit). “You have the sheer proliferation of content, and the splintering of information and audiences,” Obama observed. “That’s made democracy more complicated.”

Facts are competing with opinions, conspiracy theories, and fiction. “For more and more of us, search and social media platforms aren’t just our window into the internet. They serve as our primary source of news and information,” Obama said. “No one tells us that the window is blurred, subject to unseen distortions, and subtle manipulations.”

The splintering of news sources has also made all of us more prone to what psychologists call “confirmation bias,” Obama said. “Inside our personal information bubbles, our assumptions, our blind spots, our prejudices aren’t challenged, they are reinforced and naturally, we’re more likely to react negatively to those consuming different facts and opinions – all of which deepens existing racial and religious and cultural divides.”

But the problems are not just a matter of our brains failing to keep up with the growing amount of information online, Obama argued. “They’re also the result of very specific choices made by the companies that have come to dominate the internet generally, and social media platforms in particular.”

The former president also made clear that he did not think technology was to blame for many of our social ills. Racism, sexism, and misogyny all predate the internet, but technology has helped amplify them.

“Solving the disinformation problem won’t cure all that ails our democracies or tears at the fabric of our world, but it can help tamp down divisions and let us rebuild the trust and solidarity needed to make our democracy stronger,” Obama said.

He gave examples of how social media has fueled violence and extremism around the world. For example, leaders in countries from Russia and China to Hungary, the Philippines, and Brazil have harnessed social media platforms to manipulate their populations. “Autocrats like Putin have used these platforms as a strategic weapon against democratic countries that they consider a threat,” Obama said.

He also called out emerging technologies such as AI for their potential to sow further discord online. “I’ve already seen demonstrations of deepfake technology that show what looks like me on a screen, saying stuff I did not say. It’s a strange experience, people,” Obama said. “Without some standards, implications of this technology – for our elections, for our legal system, for our democracy, for rules of evidence, for our entire social order – are frightening and profound.”

‘Regulation Has to Be Part of the Answer’


Obama discussed potential solutions for addressing some of the problems he viewed as contributing to a backsliding of democracy in the second half of his talk.

In an apt metaphor for a speech delivered in Silicon Valley, Obama compared the U.S. Constitution to software for running society. It had “a really innovative design,” Obama said, but also significant bugs. “Slavery. You can discriminate against entire classes of people. Women couldn’t vote. Even white men without property couldn’t vote, couldn’t participate, weren’t part of ‘We the People.’”

The amendments to the Constitution were akin to software patches, the former president said, that allowed us to “continue to perfect our union.”

Similarly, governments and technology companies should be willing to introduce changes aimed at improving civil discourse online and reducing the amount of disinformation on the internet, Obama said.

“The internet is a tool. Social media is a tool. At the end of the day, tools don’t control us. We control them. And we can remake them. It’s up to each of us to decide what we value and then use the tools we’ve been given to advance those values,” he said.

The former president put forth various solutions for combating online disinformation, including regulation, which many tech companies fiercely oppose.

“Here in the United States, we have a long history of regulating new technologies in the name of public safety, from cars and airplanes to prescription drugs to appliances,” Obama said. “And while companies initially always complain that the rules are going to stifle innovation and destroy the industry, the truth is that a good regulatory environment usually ends up spurring innovation, because it raises the bar on safety and quality. And it turns out that innovation can meet that higher bar.”

In particular, Obama urged policymakers to rethink Section 230, enacted as part of the United States Communications Decency Act in 1996, which stipulates that, generally, online platforms cannot be held liable for content that other people post on their websites.

But technology has changed dramatically over the past two decades since Section 230 was enacted, Obama said. “These platforms are not like the old phone company.”

He added: “In some cases, industry standards may replace or substitute for regulation, but regulation has to be part of the answer.”

Obama also urged technology companies to be more transparent in how they operate and, “at minimum,” to share with researchers and regulators how some of their products and services are designed so there is some accountability.

The responsibility also lies with ordinary citizens, the former president said. “We have to take it upon ourselves to become better consumers of news – looking at sources, thinking before we share, and teaching our kids to become critical thinkers who know how to evaluate sources and separate opinion from fact.”

Obama warned that if the U.S. does not act on these issues, it risks being eclipsed in this arena by other countries. “As the world’s leading democracy, we have to set a better example. We should be able to lead on these discussions internationally, not [be] in the rear. Right now, Europe is forging ahead with some of the most sweeping legislation in years to regulate the abuses that are seen in big tech companies,” Obama said. “Their approach may not be exactly right for the United States, but it points to the need for us to coordinate with other democracies. We need to find our voice in this global conversation.”

 

Transcript of President Obama's Keynote

Read More

Commentary

"We Have Entered a New Historical Era": Larry Diamond on the Future of Democracy

Speaking at the April 2022 meeting of the FSI Council, Larry Diamond offered his assessment of the present dangers to global democracy and the need to take decisive action in support of liberal values.
Q&As

Does Free Speech Protect COVID-19 Vaccine Misinformation?

While some might say making or spreading known false statements related to the COVID-19 vaccine should be criminalized, the First Amendment, which guarantees free speech, continues to provide protection for people who promulgate such faulty information. So, how can the spread of misinformation be stopped without quashing free speech?
Blogs

Full-Spectrum Pro-Kremlin Online Propaganda about Ukraine

Narratives from overt propaganda, unattributed Telegram channels, and inauthentic social media accounts
Subtitle

At a conference hosted by the Cyber Policy Center and Obama Foundation, former U.S. President Barack Obama delivered the keynote address about how information is created and consumed, and the threat that disinformation poses to democracy.

Authors: Melissa Morgan
News Type: News

In a memo from March 2021, Secretary of Defense Lloyd Austin outlined new mandates for the Department of Defense to modernize, encourage innovation and “invest smartly for the future” in order to meet the dynamic threat landscape of the modern world. Writing in the same memo, he acknowledged that this goal cannot be met without the cooperation of stakeholders from across the board, including private industries and academic institutions.

In keeping with that priority, on April 5, 2022, Deputy Secretary of Defense Kathleen Hicks and her team joined a cross-departmental roundtable of faculty and students to hear more about Stanford's efforts to bring Silicon Valley-style innovation to projects at the Department of Defense and its interagency partners.

These students are working under the umbrella of the Gordian Knot Center for National Security Innovation (GKC), a new program at the Center for International Cooperation and Security (CISAC) at the Freeman Spogli Institute for International Studies (FSI). GKC aims to coordinate resources at Stanford, peer universities, and across Silicon Valley’s innovation ecosystem in order to provide cutting-edge national security education and train national security innovators.


This is a great place to be doing this. Here in Silicon Valley, there’s a huge amount of opportunity and ecosystem available across both Stanford and the broader research community and commercial sector.
Kathleen Hicks
Deputy Secretary of Defense

At the core of GKC is a series of classes and initiatives that combine STEM skills with policy know-how in a way that’s meant to encourage students to leverage entrepreneurship and innovation in order to develop rapid, scalable solutions to national security issues. Students from both undergraduate and graduate level programs, regardless of their prior experience in national defense, are encouraged to participate.

“We’re really trying to empower students to pursue national security-relevant work while they’re here at Stanford,” explains Joe Felter, GKC’s director, co-founder, and senior research scholar at CISAC. FSI and CISAC have deep roots in the type of innovative, interdisciplinary approach to policy solutions that GKC is working to implement. Michael McFaul, FSI’s director, is a founding faculty member and principal investigator for GKC, and David Hoyt, the assistant director of GKC, is an alumnus of the CISAC honors program.

Results from GKC’s classes have been very encouraging so far. Working through “Hacking for Defense,” a GKC-affiliated class taught out of the MS&E department, Jeff Jang, a new Defense Innovation Scholar and MBA student, showed how a rapid interview process and a focus on problem and customer discovery allowed his team to create enterprise software for United States Air Force (USAF) fleet management that has vastly improved efficiency, reduced errors, and built better planning capabilities into the workflow. Their product has received numerous grants and awards, and the team has received signed letters of interest from 29 different USAF bases around the world.

In another GKC class, "Technology, Innovation, and Great Power Competition,” Abeer Dahiya and Youngjun Kwak, along with Mikk Raud, Dave Sprague and Miku Yamada — three students from FSI’s Ford Dorsey Master’s in International Policy program (MIP) — have been tackling the challenges involved in developing a domestic U.S. semiconductor strategy. They were among the student teams asked to present the results of their work to Dep. Sec. Hicks during her visit.

“Attending this class has been one of the highlights of my time at Stanford,” says Mikk Raud (MIP ‘22). “It’s been a great example of how important it is to run interdisciplinary courses and bring people from different fields together.”

He continues, “As a policy student, it was very insightful for me to learn from my peers from different programs, as well as make numerous visits to the engineering quad to speak to technical professors whom I otherwise would have never met. After meeting with and presenting to Deputy Secretary Hicks and hearing about the work other students are doing, it really hit home to me that the government does listen to students, and it really is possible that a small Stanford group project can eventually lead into significant changes and improvements of the highest levels of policy making.”

This kind of renewed interest in national security and defense tech among students is precisely what the Gordian Knot Center is hoping to foster. Building an interconnected innovation workforce that can “think deeply, [and] act quickly,” GKC’s motto, is a driving priority for the center and its supporters.


We’re really trying to empower students to pursue national security-relevant work while they’re here at Stanford.
Joe Felter
GKC Director

The Department of Defense recognizes the value of this approach. In her remarks, Dep. Sec. Kathleen Hicks acknowledged that reshaping the culture and methodologies by which the DoD runs is as imperative as it is difficult.

“My life is a Gordian knot, day in and day out at the Defense Department,” she quipped. Speaking seriously, she reminded the audience of the tremendous driving power DoD has had in creating future-looking national security defenses.  “Because of its sophistication, diversity, and capacity to innovate, the U.S. Defense Industrial Base and vibrant innovation ecosystem remains the envy of the world,” Hicks emphasized. “Every day, people like you are designing, building, and producing the critical materials and technologies that ensure our armed forces have what they need.”

But she also recognized that the challenges facing the DoD are real and complex. “There are many barriers in front of the Department of Defense in terms of what it takes to operate in government and to make the kinds of shifts we need in order to have the agility to take advantage of opportunities and partner effectively.” She reiterated that one of her key priorities is to accelerate innovation adoption across DoD, including organizational structure, processes, culture, and people.

Partnerships with groups like the Gordian Knot Center are a key component of breaking down the barriers to innovation facing our national institutions and rebuilding them into new, more adaptable bridges forward. While the challenges facing the Department of Defense remain significant, the work of the students in GKC’s classes so far shows that progress is not only possible but can be made quickly.

Read More

Q&As

Are We Dumb about Intelligence?

Amy Zegart on the Capabilities of American Intel Gathering
News

Meet the Ford Dorsey Master’s in International Policy Class of 2022

The new cohort of MIP students kicked off an unusual fall quarter last week. Four of the first-year students describe what attracted them to the program and their hopes for the future.
Subtitle

A visit from the Department of Defense’s deputy secretary gave the Gordian Knot Center a prime opportunity to showcase how its faculty and students are working to build an innovative workforce that can help solve the nation’s most pressing national security challenges.


This event is made possible by generous support from the Korea Foundation and other friends of the Korea Program.

Conventional wisdom holds that foreign policy rarely becomes an issue in South Korea’s elections. However, given the unusually high anti-China sentiment among the South Korean public today, some believe it may become an “unspoken agenda” that every South Korean voter is cognizant of. As Seoul and Beijing mark their 30th anniversary of diplomatic relations this year, their mutual attraction appears to have visibly cooled. Is this a temporary setback in the neighboring countries’ relationship? What choices will Kim Jong-un make amid strategic competition between the U.S. and China? The panel will examine the factors that will shape and influence the future prospects of Seoul-Beijing ties and the relationship between North Korea and China.

Speakers:


Seong-hyon Lee is a Senior Fellow at the George H. W. Bush Foundation for U.S.-China Relations and a visiting scholar at the Fairbank Center for Chinese Studies at Harvard University. His research focuses on contemporary relations between China and South Korea. Lee received a bachelor’s degree from Grinnell College, a master’s degree from Harvard University and a PhD from Tsinghua University.


Sheen Woo, Special Policy Advisor to the South Korean Ambassador in China, joined the Korea Program at Shorenstein APARC as a 2021-22 visiting scholar. He is a specialist in China-North Korea relations with expertise in Chinese aid to and sanctions against North Korea. He has worked at and with a variety of organizations, including NGOs, start-ups, art centers, and state-run think tanks in Korea and China. While at APARC, his research focuses on the development of and changes in China's aid to North Korea. He holds a PhD in Management Science from Tsinghua University.

Gi-Wook Shin, director of APARC and the Korea Program, will moderate the discussion.

Via Zoom. Register at https://bit.ly/3tMDyjo

Panel Discussions

This event is part of Shorenstein APARC's winter 2022 webinar series, New Frontiers: Technology, Politics, and Society in the Asia-Pacific.

While North Korea’s nuclear capabilities often make headlines, the DPRK increasingly poses a risk that is more difficult to see, in the form of sophisticated cyber attacks. Neighboring South Korea, one of the most digitized nations in the world, must closely monitor and defend against North Korea’s cyber threat, as attacks can disrupt economic, social, and defense infrastructures. This panel will discuss what kind of cyber threat North Korea poses to South Korea and beyond, how South Korea addresses the North Korean cyber attacks, and what other countries can learn from their response.

Speakers:

Jenny Jun is a PhD candidate in the Department of Political Science at Columbia University and a Nonresident Fellow at the Atlantic Council’s Cyber Statecraft Initiative. Her current research explores the dynamics of coercion in cyberspace. Her broader interests include cyber conflict, North Korea, and security issues in East Asia. Jenny is a co-author of the 2015 Center for Strategic and International Studies (CSIS) report North Korea’s Cyber Operations: Strategy and Responses, published by Rowman & Littlefield. She has presented her work on North Korea’s cyber operations at various panels and has provided multiple government briefings and media interviews on the topic. She received her MA and BS from the Security Studies Program (SSP) and the School of Foreign Service (SFS) at Georgetown University, respectively.


So Jeong Kim is a principal researcher at the National Security Research Institute, which she joined in 2004. She currently leads the cybersecurity policy team and provides recommendations on cybersecurity policy and regulatory issues. She was involved in drafting South Korea’s National Cyber Security Strategy, published in April 2019, served as an adviser to the 4th and 5th UN Groups of Governmental Experts, and participated in the MERIDIAN process as an adviser and organizer. Her main research areas are national cybersecurity policy, international norm-setting processes, confidence building measures, critical information infrastructure protection, law and regulations, and cybersecurity evaluation development. She received her PhD in Engineering from the Graduate School of Information Security at Korea University in 2005.

Gi-Wook Shin, director of APARC and the Korea Program at Stanford University, will moderate the discussion.

Via Zoom. Register at https://bit.ly/3mXJSQW

Panel Discussions