Tech & Science: Abusing A Robot Won’t Hurt It, But It Could Make You A Crueller Person

10:35, 07 November 2019 | Source: gizmodo.com.au


Image: Shutterstock

Set in a dystopian 2019, the sci-fi classic Blade Runner explores how artificial humans could impact our humanity. Harrison Ford’s character experiences powerful emotional and moral effects as he goes about hunting “replicants”.

Now, in the real 2019, the influence of robots on human behaviour is increasingly relevant. Killer military robots and sex robots, for example, might alter attitudes to killing and to women, respectively.

In our research, we explored the potential link between social robots and human character.

Could treating social robots kindly make us kinder people? And could cruelty towards them make us more callous?


Types of social robots

Social robots are designed for companionship, customer service, health care and education. Many are animal-like. Paro, the furry baby seal who has even starred in The Simpsons, is used in aged care facilities. Paro can learn new names, respond to greetings, and “enjoys” being praised and petted.

AIBO is a robot dog that plays, expresses likes and dislikes, and develops a personality. Future robot companions might even be human-animal hybrids or realisations of mythical creatures such as centaurs or dragons.

Some social robots are humanoid, which means they resemble humans. Sophia, modelled on Audrey Hepburn, can recognise faces and hold simple conversations.


Human behaviour towards robots

It’s hard to predict whether and how robots might change us. Early research in human-computer interaction observed people being polite to computers. More recent research suggests humans may respect a robot’s personal space and trust its judgement.

And there are many examples where robots have pulled at our heartstrings. When Steve, a robot security guard, “drowned” in a Washington fountain, locals created a memorial for it.

Similarly, upset Japanese robot owners held Buddhist funerals for their AIBO dogs when Sony withdrew technical support for AIBO.

Watching people “abuse” robots can also elicit uneasiness.

Some years ago, a military experiment that crippled a six-legged robot was halted for being inhumane. On another occasion, when instructed by researchers to “torture” Pleo the dinosaur robot, participants frequently refused.


Yet the desire to harm robots is also real. One study found some children would, in the absence of their parents, verbally abuse, kick, and punch a service robot in a shopping mall.

Establishing a robot cruelty-kindness link

But don’t we mistakenly attribute feelings to robots?

This is possible, but uncertain. After all, we may pity or despise a character in a book, movie, or video game without believing they actually experience anything.

That said, a link between our treatment of robots and our character need not depend on us truly believing robots have feelings.

In a scene from the comedy series The Good Place, Janet the robot begs human characters not to terminate her. When the humans instinctively withdraw in sympathy, Janet comically reminds them that, as an artificial thing, she cannot feel or die. Thus, Janet implies, their reluctance to terminate her is, despite her own pleas for mercy, irrational.

But is it?

Imagine a non-talking robot which, when threatened or assaulted, struggles, staggers, tries to flee, and petitions other people for assistance. Such a robot might prompt our pity – or our cruelty – in a way that goes beyond responses to fictional characters.


In this way, it perhaps makes sense that cruelty or kindness towards social robots could encourage cruelty or kindness towards sentient beings, even when we know robots feel nothing.

Sentient animals, which have minimal legal protections, may be especially vulnerable to this effect. But humans may also be at risk.

If social robots could shape our characters in significant ways, it may be young children who are most affected, as children’s characters are especially impressionable.

Two sides of the argument

Some experts believe robots could indeed make us crueller. Consider this argument.

Humans tend to subconsciously attribute sentience (feelings) to robots. Our treatment of these robots can then influence our treatment of other living creatures.

This argument resembles philosopher Immanuel Kant’s claim of a link between animal and human cruelty. Kant said:

If a man is not to stifle his own feelings, he must practice kindness towards animals, for he who is cruel to animals becomes hard also in his dealings with men.

Just as we have animal anti-cruelty laws, some say we’ll soon need robot anti-cruelty laws.

Others are more sceptical. After all, there is no conclusive evidence that enjoying violent movies and video games breeds violence towards others.


Moreover, robots are not sentient and lack feelings. Some may argue it’s therefore impossible to be “cruel” or “kind” to them. But this isn’t entirely obvious in cases where a robot can struggle, flee, protect itself, and ask for assistance.

We should hope social robots encourage kinder actions in humans in general, rather than crueller ones.

Perhaps robot anti-cruelty laws are excessive in a liberal society.

But as robots increasingly become a part of our lives, often making decisions without human control, we have good reason to monitor the influence they have on us.

Simon Coghlan, University of Melbourne; Barbara Barbosa Neves, Monash University; Jenny Waycott, University of Melbourne

This article is republished from The Conversation under a Creative Commons license. Read the original article.
