Artificial intelligence could improve psychiatric care

19:40, 14 November 2019 (Source: popsci.com)


Across fields of medicine, researchers and doctors are looking to artificial intelligence and machine learning to help them evaluate and diagnose patients, in the hope that the technology can speed the process and pick up on signals and patterns that aren't readily apparent to the human eye or brain. In psychiatry, where decisions about care usually rest on conversations with patients, the technology has the potential to augment clinicians' work.

Image caption: AI can help boost psychiatric patient care. © Bonnier Corporation

“We’re working on how to analyze patient responses,” says Peter Foltz, a research professor at the University of Colorado’s Institute of Cognitive Science. “Currently in mental health, patients get very little interaction time with clinicians. A lot of them are remote, and it’s hard to get time with them.” To chip away at that problem, Foltz and his team are working to build applications that could collect and analyze data about individuals’ mental states and report them back to clinicians.


Tools like this aren’t designed to replace doctors and psychiatrists, he stresses—just to further improve their care. And as research into their role continues to develop, it’s equally important to devote attention to the best way to build trust in their contributions. “In order to really be able to do this, there needs to be a greater understanding from laypeople and the psychiatric community on what artificial intelligence can do, what it can’t do, and how to evaluate it,” he says.

In a new paper published this week, Foltz and his colleagues outlined a framework that they hope can establish that trust. It highlights three key goals for artificial intelligence in psychiatry to strive for: explainability, transparency, and generalizability. “We really see those as pillars that psychiatry needs to think about if we’re saying we want to apply AI in the field,” he says.


Artificial intelligence can be a black box. Any program intended for clinical use should come with information about how it was built and what data it was trained on (transparency), and clinicians should be given as much insight as possible into how the program arrives at any decision it produces (explainability).
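The explainability goal is easiest to see with a simple model. In the hypothetical sketch below (all feature names and weights are invented for illustration, not drawn from Foltz's work), a linear model's score decomposes into per-feature contributions that could be reported to a clinician alongside the prediction:

```python
# Hypothetical sketch: for a linear model, the overall score is the sum
# of per-feature contributions (weight * value), so each contribution
# can be reported alongside the prediction to support explainability.

def explain_linear_prediction(weights, features):
    """Return the model's score and each feature's contribution to it."""
    contributions = {name: weights[name] * value
                     for name, value in features.items()}
    score = sum(contributions.values())
    return score, contributions

# Illustrative, made-up features for a single speech sample:
weights = {"coherence": -2.0, "topic_jumps": 1.5, "pause_rate": 0.8}
features = {"coherence": 0.4, "topic_jumps": 0.6, "pause_rate": 0.5}

score, contributions = explain_linear_prediction(weights, features)
```

A clinician reading the output can see not just the score but which features pushed it up or down, which is the kind of "how did it get those results" accounting Foltz describes.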

“When a machine makes a prediction, what is it making its predictions on?” Foltz says. “We want to have people understand how could this be used, how does it get those results, and what those results mean.”

Artificial intelligence programs are first trained on a specific set of data with a known diagnosis or designation, and then use what they've learned from that set to make decisions about new and unknown information. However, a program is often limited by the specific population it was trained on. "We want to ensure that validation is done across a wide population, in order to ensure it can be generalizable for other areas outside the population it's trained on," Foltz says.
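The generalizability check Foltz describes amounts to fitting on one population and scoring on a held-out one. A minimal sketch, with entirely invented scores and labels, might look like this:

```python
# Hypothetical sketch: fit a simple threshold classifier on one
# population, then evaluate it on a held-out population drawn from
# elsewhere. A large accuracy drop on the held-out group would signal
# poor generalizability. All numbers here are invented.

def accuracy(scores, labels, t):
    """Fraction of correct predictions at cutoff t (score >= t -> 1)."""
    preds = [1 if s >= t else 0 for s in scores]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def fit_threshold(scores, labels):
    """Pick the cutoff that best separates classes on the training set."""
    best_t, best_acc = 0.0, -1.0
    for t in sorted(set(scores)):
        acc = accuracy(scores, labels, t)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

# Population A: used for training.
scores_a = [0.2, 0.3, 0.6, 0.8]
labels_a = [0, 0, 1, 1]
t = fit_threshold(scores_a, labels_a)

# Population B: held out for the generalizability check.
scores_b = [0.1, 0.4, 0.5, 0.9]
labels_b = [0, 0, 1, 1]
acc_b = accuracy(scores_b, labels_b, t)
```

Real validation studies compare many populations and far richer models, but the structure is the same: the threshold is never allowed to see population B during fitting.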


Those principles are important for other areas of medicine, as well. In psychiatry, though, artificial intelligence has the potential to ease a bottleneck: conversations with patients have always needed to be interpreted by humans, but now, some of that may be done by machines.

Foltz’s team is working on applications that can record information from open-ended questions to patients and analyze speech patterns to learn about their mental state. “We’re looking at how they say things, and components of what they’re saying,” he says. “We can see how coherent it is, how well they’re staying on topic, how big their jumps are from one topic to another, and the structure of their language.” Preliminary results show that the program can interpret a patient’s mental state at least as well as a clinician listening to the same recorded answers.
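One of the measurements described above, how big a speaker's jumps are from one topic to another, can be roughly approximated with lexical overlap between consecutive utterances. The sketch below is a deliberately crude stand-in (real systems like Foltz's use much richer language models), with made-up example answers:

```python
# Hypothetical sketch: use word overlap (Jaccard similarity) between
# consecutive utterances as a crude proxy for topic coherence; very low
# overlap flags a large topic jump. Real systems use richer features.

def jaccard(a, b):
    """Jaccard similarity between the word sets of two utterances."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def topic_jumps(utterances, threshold=0.1):
    """Indices of utterances that share almost no words with the
    utterance immediately before them."""
    return [i for i in range(1, len(utterances))
            if jaccard(utterances[i - 1], utterances[i]) < threshold]

# Invented example transcript:
answers = [
    "i have been sleeping badly this week",
    "sleeping is hard because i keep worrying",
    "the bus was red and the river was cold",
]
jumps = topic_jumps(answers)  # the third answer changes topic abruptly
```

The first two answers share words ("sleeping", "i") and stay connected; the third shares none, so it is flagged, illustrating the kind of structural signal a clinician might otherwise have to notice by ear.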

The team is working to refine its measurements and to see how the tool could be applied to a range of mental health conditions, from schizophrenia to mild cognitive impairment. Foltz says, though, that it will likely be a while before these types of programs are used clinically.


“The timeline is pretty far out, probably in the five- to 10-year range. Some of that is from the need to do more research and refine research, and some is running larger studies to test generalizability,” he says. “We’re still figuring out how this works as a tool for being able to monitor patients.”

