Fictron Industrial Supplies Sdn Bhd

Home Robot Control for People With Disabilities

18-Apr-2019

Robots offer a way for people to remain safe and comfortable in their homes as they grow older. In the near future (we’re all hoping), robots will be able to help us by cooking, cleaning, doing chores, and generally taking care of us, but they’re not yet at the point where they can do those sorts of things autonomously. Putting a human in the loop can help robots become useful sooner, which is especially significant for the people who stand to benefit most from this technology: those with disabilities that make them more reliant on care.
 
The interface is structured around a first-person perspective, with a video feed streaming from the PR2’s head camera. Augmented reality markers show controls in 3D space, give visual previews of how the robot will move when commands are executed, and relay feedback from nonvisual sensors like tactile sensors and obstacle detection. One of the biggest difficulties is adequately representing the robot’s 3D workspace on a 2D screen; a “3D peek” feature helps by overlaying a low-resolution, Kinect-based 3D model of the environment around the robot’s gripper and then simulating a camera rotation. To keep the interface accessible to users with only a mouse and single-click control, there are several selectable operation modes (a rough sketch of this click-to-command dispatch follows the list below), including:
 
Looking mode: Displays the mouse cursor as a pair of eyeballs, and the robot looks toward any point where the user clicks on the video.
 
Driving mode: Allows users to drive the robot in any direction without rotating, or to rotate the robot in place in either direction. The robot drives toward the location on the ground indicated by the cursor over the video when the user holds down the mouse button, and three overlaid traces show the selected movement direction, updating in real time. “Turn Left” and “Turn Right” buttons over the bottom corners of the camera view turn the robot in place.
 
Spine mode: Displays a vertical slider over the right edge of the image. The slider handle indicates the relative height of the robot’s spine, and moving the handle raises or lowers the spine accordingly. These direct manipulation features use the context provided by the video feed to allow users to specify their commands with respect to the world, rather than the robot, simplifying operation.
 
Left-hand and right-hand modes: Allow control of each gripper through separate submodes for position control, orientation control, and grasping (opening and closing the gripper). In either mode, the head automatically tracks the robot’s fingertips, keeping the gripper centered in the video feed and eliminating the need to switch modes to keep the gripper in the camera view.
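
To make the single-click, mode-based control concrete, here is a minimal sketch of how a click on the 2D video feed might be turned into a robot command. It is only an illustration under assumptions: the Mode enum, the HeadCamera pinhole model and its numeric parameters, and the robot.look_at_pixel, drive_toward, and gripper_command methods are hypothetical stand-ins, not the actual Georgia Tech interface.

```python
# Hypothetical sketch of single-click, mode-based robot control.
# All class and method names here are stand-ins for illustration only.

from dataclasses import dataclass
from enum import Enum, auto
import math


class Mode(Enum):
    LOOKING = auto()      # click a point in the video; the head looks there
    DRIVING = auto()      # hold the button; drive toward the clicked floor point
    SPINE = auto()        # raise or lower the spine via a slider widget
    LEFT_HAND = auto()    # gripper submodes: position, orientation, grasping
    RIGHT_HAND = auto()


@dataclass
class HeadCamera:
    """Simple pinhole model of the head camera (assumed intrinsics and pose)."""
    fx: float = 525.0     # focal lengths, pixels
    fy: float = 525.0
    cx: float = 320.0     # principal point, pixels
    cy: float = 240.0
    height: float = 1.2   # camera height above the floor, metres
    tilt: float = 0.6     # downward tilt of the camera, radians

    def pixel_to_ground(self, u, v):
        """Intersect the viewing ray through pixel (u, v) with the floor plane."""
        x = (u - self.cx) / self.fx              # ray direction in the camera frame
        y = (v - self.cy) / self.fy              # (z forward, x right, y down)
        down = math.sin(self.tilt) + y * math.cos(self.tilt)
        forward = math.cos(self.tilt) - y * math.sin(self.tilt)
        if down <= 1e-6:
            return None                          # clicked at or above the horizon
        t = self.height / down                   # scale where the ray hits the floor
        return forward * t, -x * t               # (forward, left) of the robot base


def handle_mouse(mode, camera, robot, u, v, button_down):
    """Route one mouse event on the video feed to a robot command."""
    if mode is Mode.LOOKING and button_down:
        robot.look_at_pixel(u, v)                # head re-centres on the clicked point
    elif mode is Mode.DRIVING and button_down:
        target = camera.pixel_to_ground(u, v)
        if target is not None:
            robot.drive_toward(*target)          # drive toward the clicked floor point
    elif mode in (Mode.LEFT_HAND, Mode.RIGHT_HAND) and button_down:
        robot.gripper_command(mode, u, v)        # submode decides position/orientation/grasp
    # SPINE mode is driven by its slider widget rather than clicks on the video.
```

The point the sketch tries to capture is the design choice described above: every command is a single click interpreted in the context of the live video, with the active mode deciding what that click means.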
This kind of interface is not going to be the fastest way to control a robot, but for some people, it’s the only way. And as Henry Evans says, he’s patient.
 
In a study in which 15 participants with disabilities took control of Georgia Tech’s PR2 over the Internet after very little training (a bit over an hour), this software interface proved both easy to use and effective. It’s definitely not fast: simple tasks like picking up objects took most participants around 5 minutes, when they would take an able-bodied person 5 seconds. But as Georgia Tech professor Charlie Kemp and Phillip Grice, a recent Georgia Tech Ph.D. graduate, point out in a recent PLOS ONE paper, “for individuals with profound motor deficits, slow task performance would still increase independence by enabling people to perform tasks for themselves that would not be possible without assistance.”
 
A separate study with Evans, considered an “expert user,” demonstrated how much potential there is in a system like this.
 
Of course, a PR2 is probably overkill for many of these tasks, and it’s not likely to be available to most people who could use an assistive robot. But the interface that Georgia Tech has developed here could be applied to many different kinds of robots, including lower-cost arms (like UC Berkeley’s Blue) that wouldn’t necessarily need a mobile base to be effective. And if a robot arm could keep someone independent and comfortable for hours at a time without a human caretaker, it’s possible that the technology could even pay for itself.



This article was originally posted on Tronserve.com.