로봇신문사
Boycott of Korea’s KAIST over ‘killer robots’ ends
ÆùƮŰ¿ì±â ÆùÆ®ÁÙÀ̱â ÇÁ¸°Æ®Çϱ⠸ÞÀϺ¸³»±â ½Å°íÇϱâ
Published 2018.04.10 11:16:50

A boycott by leading AI & robotics researchers of South Korea’s KAIST has been called off after the university’s president committed not to develop lethal autonomous weapons.
More than 50 of the world’s leading artificial intelligence (AI) and robotics researchers from 30 different countries have declared they would end a boycott of the Korea Advanced Institute of Science and Technology (KAIST), South Korea’s top university, over the opening of an AI weapons lab in collaboration with Hanwha Systems, a major arms company.

At the opening of the new laboratory, the Research Centre for the Convergence of National Defence and Artificial Intelligence, it was reported that KAIST was “joining the global competition to develop autonomous arms” by developing weapons “which would search for and eliminate targets without human control”. A further cause for concern was that KAIST’s industry partner, Hanwha Systems, builds cluster munitions, despite a UN ban, as well as a fully autonomous weapon, the SGR-A1 Sentry Robot. In 2008, Norway excluded Hanwha from its $380 billion future fund on ethical grounds.

KAIST’s President, Professor Sung-Chul Shin, responded to the boycott by affirming in a statement that “KAIST does not have any intention to engage in development of lethal autonomous weapons systems and killer robots”. He went further by committing that “KAIST will not conduct any research activities counter to human dignity including autonomous weapons lacking meaningful human control.”

Given this swift and clear commitment to the responsible use of artificial intelligence in the development of weapons, the 56 AI and robotics researchers who were signatories to the boycott have rescinded the action. They will once again visit and host researchers from KAIST, and collaborate on scientific projects.

Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales in Sydney, who initiated the action, praised KAIST for the rapid response. “I was very pleased that the president of KAIST has agreed not to develop lethal autonomous weapons, and to follow international norms by ensuring meaningful human control of any AI-based weapon that will be developed,” he said. “I applaud KAIST for doing the right thing, and I’ll be happy to work with KAIST in the future.

“It goes to show the power of the scientific community when we choose to speak out – our action was an overnight success,” he added. “We initially sought assurances in private from the university more than a month ago about the goals of their new lab. But the day after we announced the boycott, KAIST gave assurances very publicly and very clearly.”

“There are plenty of good applications for AI, even in a military setting. No one, for instance, should risk life or limb clearing a minefield – this is a perfect job for a robot. We should not, however, hand over the decision of who lives or who dies to a machine – this crosses an ethical red line and will result in new weapons of mass destruction.”

The boycott arose ahead of meetings this week in Geneva of 123 member nations of the United Nations, convened as the Group of Governmental Experts to the Convention on Certain Conventional Weapons, to discuss the challenges posed by lethal autonomous weapons (often called ‘killer robots’). The group will consider military applications of AI, and options for addressing the humanitarian and international security challenges posed by lethal autonomous weapons systems. Already, 22 of the nations taking part have called for an outright and pre-emptive ban on such weapons.

“Back in 2015, thousands of my colleagues in AI wrote an open letter to the UN warning of an arms race to develop autonomous weapons,” Walsh said. “We couldn’t therefore sit back and watch a top university collaborate with such a controversial industry partner to accelerate that race.”

정원영  robot3@irobotnews.com
Copyright © 2013 로봇신문사. All rights reserved. mail to editor@irobotnews.com