Technology: Activists warn UN about dangers of using AI to make decisions on what human soldiers should target and destroy on the battlefield

12:20, 23 October 2019 | Source: dailymail.co.uk

Activists warned the UN about killer robots in the military during a panel discussion, saying that letting AI decide what human soldiers should target and destroy is unethical, immoral and a decision that cannot be undone, and calling for a ban on fully autonomous weapons so that meaningful human control over the use of force is retained.

A Nobel Peace Prize winner has warned against robots making life-and-death decisions on the battlefield, saying it is 'unethical and immoral' and can never be undone.

Jody Williams made the statement at the United Nations in New York City after the US military announced its project that uses AI to make decisions on what human soldiers should target and destroy.

Williams also pointed out the difficulty of holding those involved accountable for certain war crimes, as there will be a programmer, manufacturer, commander and the machine itself involved in the act.

Williams won the prestigious accolade in 1997 after leading efforts to ban landmines and is now an advocate with the 'Campaign To Stop Killer Robots'.

'Drones started out, you know, as surveillance equipment, and then suddenly they stuck on some Hellfire missiles, and they were, you know, killer,' she said during a panel discussion at the United Nations in New York City yesterday.

'We're hoping, and really expecting that the larger community would not find out about the research and development of killer robots.'

'We need to step back and think about how artificial intelligence robotic weapons systems would affect this planet and the people living on it.'

Image: The MQ-9 Reaper, which is set to incorporate AI for making decisions on the battlefield (© Provided by Associated Newspapers Limited). Activists against killer robots have also pleaded with officials to draft regulations for any craft heading into battle without human intervention, whether by land, sea or air.

Williams is referring to the US military's new initiative, Project Quarterback, which is using AI to make split-second decisions on how to carry out attacks in the field.

The Campaign To Stop Killer Robots was formed in October 2012 with the goal of banning fully autonomous weapons and thereby retaining meaningful human control over the use of force.

The activists against killer robots have also pleaded with officials to draft regulations for any craft heading into battle without human intervention, whether by land, sea or air.

Liz O'Sullivan, of the International Committee for Robot Arms Control, said: 'If we allow autonomous weapons to deploy and selectively engage with their own targets, we will see disproportionate false fatalities and error rates with people of color, people with disabilities, anybody who has been excluded from the training sets by virtue of the builders' own inherent bias.'
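O'Sullivan's underlying point, that a system trained on data which leaves a group out will make more mistakes on that group, can be illustrated with a small, purely hypothetical experiment. The sketch below is not related to any real weapons system or to the projects mentioned in this article; the synthetic data, the nearest-centroid 'model' and the group labels are all invented for illustration.

```python
import random

# Toy illustration of training-set exclusion bias: a classifier trained only on
# "group A" examples is evaluated on both groups. All data here is synthetic.

random.seed(0)

def make_samples(n, group):
    """Generate (feature, label) pairs. Group B's feature distribution is
    shifted, standing in for people under-represented in the training data."""
    samples = []
    for _ in range(n):
        label = random.choice([0, 1])
        shift = 0.0 if group == "A" else 1.5   # group B simply "looks different"
        x = random.gauss(2.0 * label + shift, 1.0)
        samples.append((x, label))
    return samples

def train_centroids(samples):
    """Per-class mean feature value: a minimal nearest-centroid 'model'."""
    sums, counts = {0: 0.0, 1: 0.0}, {0: 0, 1: 0}
    for x, y in samples:
        sums[y] += x
        counts[y] += 1
    return {y: sums[y] / counts[y] for y in (0, 1)}

def error_rate(centroids, samples):
    """Fraction of samples assigned to the wrong class centroid."""
    wrong = sum(1 for x, y in samples
                if min(centroids, key=lambda c: abs(x - centroids[c])) != y)
    return wrong / len(samples)

# Train only on group A -- group B is "excluded from the training set".
model = train_centroids(make_samples(2000, "A"))

print("error on group A:", round(error_rate(model, make_samples(1000, "A")), 3))
print("error on group B:", round(error_rate(model, make_samples(1000, "B")), 3))
```

Running the sketch shows a markedly higher error rate for the excluded group B than for group A, which is the kind of disproportionate failure the activists warn about when such systems are given lethal authority.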

Mary Wareham, another activist, pointed out that during meetings at the UN in Geneva in August, 'Russia and the United States were the key problems' as they 'did not want to see any result' towards the drafting of a ban treaty.

She said: 'Other countries that are investing heavily into ever increasingly autonomous weapon systems include China, South Korea, Israel, the United Kingdom to some extent; perhaps Turkey, perhaps Iran.'

Another dangerous factor that comes into play with killer robots is the question of who, or what, will be held accountable for war crimes.

'It's unclear who, if anyone, could be held responsible for unlawful acts caused by a fully autonomous weapon: the programmer, manufacturer, commander, [or the] machine itself,' said Williams.

'This accountability gap would make it difficult to ensure justice, especially for victims.'
