
Watch This Disney Robot Make the Most Convincing Eye Contact Ever

00:15, 17 April 2021 · Source: popularmechanics.com


While most humanoid robots generally focus their eyes on a human face and stay there, that's not how people interact with each other (except on Zoom calls, maybe). The Disney Research team explained in its paper, Realistic and Interactive Robot Gaze: "Gaze has been shown to be a key social signal. For example, people who make more eye contact with us are perceived to be similar to us, as well as more intelligent, conscientious, sincere, and trustworthy. Furthermore, gaze appears to also convey complex social and emotional states. Given the importance of gaze in social interactions as well as its ..."

Disney Research unveiled an audio-animatronic human bust with a lifelike gaze that eerily mimics a real person's eye and head movements. Audio-Animatronics are Disney's animatronic figures used throughout its theme parks to serve as lifelike characters. According to the study announcing the new eye-gaze capability, these widely used figures imitate real life by using "fluid motions." The robot relies on a camera sensor mounted on its torso and can mimic human-like actions such as tilting its head, blinking, making rapid eye movements, and breathing, to name a few.

Disney Research scientists recently created a system for a more realistic robot gaze, and it's seriously lifelike. © Disney Research
  • Disney Research scientists recently created a system for a more realistic robot gaze.
  • The team demoed the technology with a humanoid animatronic bust.
  • To make interactions with the robot more realistic, the team programmed movement into not only the eyes, but also the neck and eyebrows.

Disney World could one day feature some of the most realistic animatronic characters on the planet, making your stay that much more magical. Imagine robots that can accurately follow your gaze while talking to you, raise their eyebrows, and even periodically break eye contact, just as any stranger would.


Robots with human-like expressions are becoming ever more impressive, but Disney Research might just induce some nightmares with its latest project. Gizmodo reports that Disney has developed a system that gives humanoid robots more realistic gazes and head movements instead of the unnerving stares you often get from automatons. The appeal for Disney itself is fairly clear. It could use this human-like gaze for more convincing animatronics at its theme parks, at least once it finds a skin to avoid terrifying park guests.



Scientists at Disney Research, the network of labs supporting the company's technological endeavors, have recently devised a new system for creating a lifelike robotic gaze.

By introducing minute "secondary behaviors" that humans exhibit in a conversation—from the flicker of the pupils between focal points, to the faint tilt of the head—the team managed to craft a machine that feels sort of human. The scientists presented their paper at the International Conference on Intelligent Robots and Systems last fall.

In effect, the humanoid robot comes off as incredibly lifelike, despite its face being mostly uncovered, exposing the electronics beneath. For now, that's fine; Disney artists can enhance the face later, Doug Fidaleo, director of Disney Research Los Angeles, tells Pop Mech.




Fidaleo's team is responsible for the hardware and software that could one day appear in Disney's proprietary "Audio-Animatronics" figures, which the company uses to create repeatable live shows and experiences (like "It's a Small World" and its 300 Audio-Animatronics dolls). So far, the results have been pretty convincing.

"I know the first time that I sat in front of [the robot], [I got] a little nervous, because you actually believe that this thing is alive," Fidaleo says."That threshold of feeling something, nervousness or something, is critical."

It takes some serious finesse on the software side to build that sense of realism. The engineering team places most of its emphasis on transitions and blending, so that from one moment to the next there are no hard stops that might give away the animatronic figure's true robot identity and break the experience.




If the robot is looking at one person, for instance, and then another child walks up to it, the animatronic figure can sense that with its onboard RGB camera, says James Kennedy, a research scientist with Disney Research Los Angeles. From the perception side, the robot gets a new set of coordinates to look at, and it will slowly transition its gaze to focus on that second child.

"Now, you have to make the decision of,'How do I go from where I am currently facing and get to these new coordinates?' And, you know, as a robot, that's a very simple problem," Kennedy tells Pop Mech. "You can draw a straight line and do that. But that's not particularly believable. People don't move in that way. And so there are a lot of these small details that we do."

Through the programs his team has built, Kennedy says it's possible to dictate how long one glance should last before slowly blending into another motion. From there, the software is fit with rules for the acceleration and deceleration of the motors that control the robot's neck, face, and torso along a particular curve.
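As a rough illustration of that kind of blended transition, here is a minimal Python sketch. It is not Disney's code; the easing curve, angle values, and update rate are assumptions chosen only to show how a gaze could accelerate and decelerate into a new target instead of sweeping there at constant speed.

```python
def ease_in_out(t: float) -> float:
    """Smoothstep easing: start slowly, speed up, then settle (0 <= t <= 1)."""
    return t * t * (3.0 - 2.0 * t)

def gaze_transition(current, target, duration_s=0.5, rate_hz=60):
    """Yield intermediate (pan, tilt) angles in degrees that blend from the
    current gaze direction to the new target, rather than snapping along a
    straight, constant-velocity line."""
    steps = max(1, int(duration_s * rate_hz))
    for i in range(1, steps + 1):
        a = ease_in_out(i / steps)
        yield tuple(c + (t - c) * a for c, t in zip(current, target))

# Example: shift attention from straight ahead to a person roughly 30 degrees
# to the left over half a second, handing each pose to the motor controllers.
for pan, tilt in gaze_transition((0.0, 0.0), (30.0, -5.0)):
    pass  # send (pan, tilt) to the neck/eye motors here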


For example, one of the most impressive features relates to saccades, or quick, simultaneous movements of the eyes between fixation points. Think about making eye contact during a job interview, when you're probably most aware of your body language. Your eyes don't remain static while looking at your future boss, but rather, they subtly dart back and forth.

So, if you wanted to have a staring competition with this animatronic bust, you'd probably win—and that's by design.

"It would be quite unnerving for [the robot] to fixate on a single point on your face," Kennedy says."It's something we could do technologically, but it would be quite unnatural to people."

The researchers have programmed the robot to sort of mimic what the people in its line of sight are doing, from tilting its head in sync with guests, to blinking, and even subtly "breathing." Engineers combine these motions into a few different states of being based on a "curiosity score" that records the number and type of stimuli in the surrounding environment.

In the robot's default "read" state, it uses eye motions that make it seem like the figure is reading a book at torso level. In the "glance" state, the robot takes a look at the person of interest and tilts its head in the appropriate direction, just like a real person might shift if you distracted them while reading.

In the "engage" state, which is triggered by an even higher curiosity score, the robot looks at the person of interest while turning its head. Finally, there's the "acknowledge" state, which the robot uses after it detects that the person of interest is familiar, someone it has already interacted with.
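A toy version of that state selection might look like the following. The thresholds and the Stimulus fields are invented for illustration; the paper's actual curiosity scoring and tuning are not reproduced here.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Stimulus:
    curiosity: float   # accumulated curiosity score for this person or event
    familiar: bool     # has the robot already interacted with them?

# Threshold values are made up for illustration only.
GLANCE_THRESHOLD = 0.3
ENGAGE_THRESHOLD = 0.7

def select_state(stimulus: Optional[Stimulus]) -> str:
    """Pick one of the four attention states described above."""
    if stimulus is None or stimulus.curiosity < GLANCE_THRESHOLD:
        return "read"         # default: eyes scan as if reading at torso level
    if stimulus.familiar:
        return "acknowledge"  # brief recognition of someone already engaged
    if stimulus.curiosity >= ENGAGE_THRESHOLD:
        return "engage"       # look at the person and turn the head
    return "glance"           # look over and tilt the head briefly

# Example: a new, very curious stimulus triggers the "engage" state.
print(select_state(Stimulus(curiosity=0.9, familiar=False)))  # -> "engage"
```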

This imparts the animatronic bust with what appears to be a strange sense of empathy. Humans, monkeys, and even birds have "mirror neurons" in the brain that fire when an animal sees another being performing the same action. Robots don't have a biological framework like this on which to rely, so these four states of being are a close second.

Fidaleo is careful to note Disney hasn't yet confirmed any future use cases for the realistic robot gaze.

Still, we do feel a little bit bad for the "It's a Small World" animatronic dolls, which haven't changed much since the 1960s. With these realistic robots on the horizon, they're in for some rough competition. Maybe it's time to start taking notes.

