Newsy Today
news of today

New embodied AI system teaches users complex movements via muscles

by Chief Editor April 10, 2026

The Rise of ‘Embodied AI’: When Artificial Intelligence Feels Like a Gentle Hand

Imagine learning a novel skill – opening a tricky jar, using a foreign appliance, or even performing a delicate physical therapy exercise – and feeling a subtle, guiding force on your own muscles. This isn’t science fiction; it’s the emerging reality of “embodied AI,” a field poised to revolutionize how we interact with technology and learn new skills. Researchers at the University of Chicago, led by Yun Ho, Romain Nith, and Pedro Lopes, are at the forefront of this movement, recently earning a Best Paper Award at the ACM CHI 2026 conference for their groundbreaking work.

From Specialized Gadgets to Context-Aware Assistance

For years, electrical muscle stimulation (EMS) has been used in rehabilitation and physiotherapy, delivering electrical impulses that trigger muscle contractions. Traditional EMS systems, however, were limited: designed for specific tasks, they could not adapt to changing contexts. An EMS device programmed to shake a spray can, for instance, would produce the same unhelpful shaking motion when the user picked up a can of cooking oil instead. The new system developed by Ho, Nith, and Lopes overcomes this limitation by integrating AI to understand the user’s environment and intent.

This new approach leverages multimodal AI – combining computer vision and large language models – to generate muscle stimulation instructions tailored to the situation. The system doesn’t simply follow a pre-programmed routine; it “improvises” alongside the user, offering guidance based on what it “sees” and “understands.”
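The article does not publish the team's actual code, but the described pipeline (camera input interpreted by a multimodal model, which then emits muscle stimulation instructions) can be sketched roughly as below. Everything here is hypothetical: the `MuscleCue` fields and the rule table are illustrative stand-ins for the vision-plus-LLM stage, not the researchers' implementation.

```python
from dataclasses import dataclass

@dataclass
class MuscleCue:
    muscle: str        # target muscle group (electrode pair), hypothetical names
    intensity: float   # 0.0-1.0, fraction of a per-user calibrated maximum
    duration_ms: int   # length of the stimulation pulse train

def plan_cue(scene_description: str) -> MuscleCue:
    """Stand-in for the multimodal model: map what the camera 'sees'
    to a context-appropriate stimulation instruction."""
    rules = {
        "spray can": MuscleCue("wrist_flexor", 0.6, 400),      # shaking motion
        "pill bottle": MuscleCue("thumb_adductor", 0.8, 600),  # press-and-twist
        "camera dial": MuscleCue("finger_extensor", 0.4, 250), # small rotation
    }
    for keyword, cue in rules.items():
        if keyword in scene_description:
            return cue
    # Unknown context: stimulate nothing, leaving the user fully in control.
    return MuscleCue("none", 0.0, 0)

cue = plan_cue("user holding a child-proof pill bottle")
print(cue.muscle, cue.intensity)  # thumb_adductor 0.8
```

The fallback of doing nothing when the scene is unrecognized mirrors the researchers' stated priority of user agency: the system acts only when it has something relevant to offer.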

“I am curious about how people understand and build relationships with devices that communicate with them through body movements (rather than audio/visual). In ‘embodied AI’, I got to explore this question in the realm of physical assistance. It was especially insightful to have participants ‘think aloud’ as they used our system and learn how they interpret machine-induced movements.” – Yun Ho, PhD student, Department of Computer Science, University of Chicago

How ‘Embodied AI’ Works: It’s About ‘Know-How,’ Not Just ‘Know-That’

The key innovation lies in transmitting “procedural knowledge” – the intuitive understanding of how to perform a task – directly to the muscles. Instead of providing factual information, the system guides the body through the correct movements, enabling users to learn by doing. In user studies, participants successfully completed tasks like opening child-proof pill bottles and operating unfamiliar cameras with the assistance of dynamically generated muscle cues. Even when the AI made deliberate errors, users were able to adapt and correct the system, demonstrating a collaborative learning process.

Beyond the Lab: Real-World Applications of Muscle Stimulation and AI

The potential applications of this technology are vast and span numerous industries:

  • Healthcare and Rehabilitation: Assisting patients with physical therapy exercises at home, providing guidance on proper biomechanics.
  • Industrial and Skilled Labor: Guiding workers through new equipment procedures, reducing injury risk and accelerating training.
  • Accessibility: Providing direct bodily guidance to blind or low-vision users, making environments more accessible.
  • Everyday Life: Assisting with unfamiliar tasks, from operating foreign appliances to assembling gadgets.

Lopes emphasizes that while current limitations exist – including electrode calibration and the sensation of EMS – rapid advancements in both AI and EMS hardware are paving the way for more comfortable and user-friendly systems.

The Future of Human-Machine Collaboration

This research isn’t about replacing traditional instruction; it’s about augmenting it. The system is designed to complement audiovisual guidance, enriching the learning experience by engaging the body directly. The research team has open-sourced their code, encouraging further development and innovation within the community.

As the field evolves, ethical considerations – such as user control and safety – are paramount. The researchers have prioritized user agency, ensuring that the AI only acts when invited and that participants can interrupt or adjust the guidance at any time.

Frequently Asked Questions

Q: What is ‘embodied AI’?
A: It’s a new approach to human-computer interaction that uses artificial intelligence and electrical muscle stimulation to physically guide users through tasks.

Q: How does this differ from traditional EMS?
A: Traditional EMS is task-specific, while this new system adapts to the user’s context and provides dynamic guidance.

Q: What are the potential benefits of this technology?
A: It could improve learning, rehabilitation, accessibility, and performance in a wide range of tasks.

Q: Is this technology readily available to consumers?
A: Not yet. The system is currently in the research and development phase, but progress is rapid.

Did you know? The University of Chicago team’s work on SplitBody, a related project focusing on reducing mental workload during multitasking via muscle stimulation, received a Best Paper Award at ACM CHI 2024.

Pro Tip: The success of ‘embodied AI’ hinges on creating comfortable and easily calibrated EMS hardware. Expect significant innovation in this area in the coming years.

Interested in learning more about the intersection of AI and human augmentation? Explore recent publications from Yun Ho and Romain Nith on Yun Ho’s website and Romain Nith’s website.

New York metro transit systems add on-demand sign language interpreters

by Chief Editor February 10, 2025

The Future of Accessibility in Public Transit: A Major Step Forward

New York’s bustling transit network is once again setting trends in accessibility, bridging communication gaps for the deaf community. The partnership between Convo and the MTA introduces an innovative solution that leverages QR codes to provide real-time sign language interpretation via mobile devices. This initiative not only represents a significant leap in accessibility but also hints at future trends in public transportation and communication technologies.

Leveraging Technology for Enhanced Accessibility

The introduction of QR codes across key transit stations, from Moynihan Train Hall to Penn Station, opens the door to enhanced, real-time communication. For individuals like Jarrod Musano, a deaf traveler, this technological advancement eliminates barriers, allowing seamless interactions despite power outages or transit delays. As users scan the QR codes, they gain immediate access to sign language interpreters, enabling intuitive communication in real-time—a much-needed service that ensures inclusivity in one of the world’s most renowned transit systems.

The Role of Mobile Connectivity

Integrating technology into public services necessitates reliable mobile connectivity. Currently, the MTA is expanding Wi-Fi coverage throughout its network—an essential move given the challenges users might face with connectivity in spaces like Moynihan Train Hall. With T-Mobile users often experiencing fluctuating signals in such environments, the reliability of these networks becomes crucial to maintaining seamless communication channels, especially when deploying services like Convo.

Sign Language Interpretation: A Preferred Solution

Convo’s choice of live sign language interpreters over typed messages reflects the native-language preferences of many in the deaf community. Sign language offers a more natural form of communication for native users and better captures syntax and context, making interactions more efficient and reducing the frustration of cumbersome alternatives. This emphasis on real-time interpretation could pave the way for broader language support in the future, ensuring access to a wider variety of dialects and languages.

Global Implementation and Partnerships

While Convo’s focus currently spans U.S. stations, its global footprint, with significant partnerships in the UK, Australia, and other countries, showcases the scalability of such solutions. Convo’s collaborations with British Airways, Aer Lingus, and the British railway system highlight its potential to transform accessibility standards worldwide. As the service evolves, insights gathered from these deployments will inform enhancements and new language capabilities with worldwide implications.

Future Trends in Accessible Transit

Looking ahead, the intersection of accessibility technology and public infrastructure will likely see broader adoption of real-time translation services and AI-driven augmentations. Integrating these services points to a future where travel becomes more inclusive, smooth, and accommodating of diverse needs. Innovations in AI and machine learning could allow instant translation across many languages, creating travel experiences that feel personalized and considerate of individual needs.

FAQs on Transit Accessibility

  • What is Convo?  

    Convo is a company providing real-time sign language interpretation services through its app, Convo Now.

  • How does Convo’s service in New York and New Jersey work?  

    Users scan a QR code, granting access to their camera and microphone to connect with a live interpreter for up to 20 free minutes per month.

  • Why is real-time interpretation preferred?  

    It allows for more natural and efficient communication, especially important as sign language is a native language for many in the deaf community.

  • What languages are supported now?  

    Currently, the service supports American Sign Language, with potential expansion to other languages.
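The FAQ above describes a metered allowance of 20 free interpreter minutes per month. A minimal sketch of how such a monthly quota could be tracked is shown below; this is purely illustrative (the class name, reset logic, and client-side tracking are assumptions, not Convo's actual billing implementation).

```python
from datetime import date

FREE_MINUTES_PER_MONTH = 20  # allowance described in the FAQ above

class InterpreterQuota:
    """Hypothetical tracker for a monthly free-minute allowance."""

    def __init__(self):
        self.month = None  # (year, month) the running total applies to
        self.used = 0.0    # minutes consumed so far this month

    def remaining(self, today: date) -> float:
        """Minutes still free this month, resetting when the month rolls over."""
        key = (today.year, today.month)
        if key != self.month:
            self.month, self.used = key, 0.0
        return FREE_MINUTES_PER_MONTH - self.used

    def record_call(self, today: date, minutes: float) -> None:
        """Log an interpreter session against the current month's allowance."""
        self.remaining(today)  # roll the month forward first if needed
        self.used += minutes

q = InterpreterQuota()
q.record_call(date(2025, 2, 10), 7.5)
print(q.remaining(date(2025, 2, 10)))  # 12.5
print(q.remaining(date(2025, 3, 1)))   # 20.0 (new month, allowance reset)
```

Resetting on month rollover, rather than 30 days after first use, matches the "per month" phrasing in the FAQ, though the article does not specify which scheme Convo actually uses.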

Did you know? Real-time interpretation can significantly reduce communication delays and enhance understanding during transit disruptions.

Call to Action

Young professionals, travelers, and accessibility advocates: have you experienced these accessibility advancements firsthand? Share your stories in the comments or explore our other articles on tech innovations transforming daily life. Subscribe to our newsletter for more insights and updates on how technology is reshaping the world of accessibility.

