Motion Capture Lab

Swarm Robot in the Motion Capture Lab.

Inside the SMART Motion Capture Lab, students create a simulated environment to demonstrate how autonomous air and ground robots can work together. Photo: Douglas Levere

Initial Reservations

Reservations for the facility will follow this process:

  • Reservations must be initiated by faculty members as they are best positioned to determine the time-sensitive nature of the research.  
  • Requests for access (which must be made at least 2 business days in advance) shall be sent to the SMART email address (smart-coe@buffalo.edu).
  • Prior to confirming a reservation, prospective users (students and faculty) must complete orientation.
  • After receipt of the agreement, the reservation will be confirmed and swipe access requested.

About the Motion Capture Lab

The 1,500 sq ft SMART Motion Capture Lab is a multipurpose, rapidly reconfigurable facility for small-scale production and manufacturing, experiential learning opportunities, and authentic testing of robotics and advanced manufacturing technologies. Equipped with a 250 ft observation deck, this high-bay facility provides an authentic manufacturing environment with the added capability of observing and recording activities in the space.

Who is this facility open to?

Faculty • Staff • Students • Researchers at other academic institutions • Government and industry

Facility Information

Location:
126 Bonner Hall, North Campus

Ken English
Co-Director, SMART Community of Excellence
Email: kwe@buffalo.edu
Phone: (716) 645-2683

Equipment

Vicon Vantage V8 and Vero v2.2 Optical Motion Capture System

  • The Vicon Vantage V8 and Vero v2.2 optical capture cameras operate at 8.0 and 2.2 megapixel resolution at up to 250 and 330 frames per second, respectively
  • Well suited to dynamic motion capture and tracking of multiple autonomous rigid bodies
  • The higher resolution of the V8 cameras allows better tracking of small markers placed close together on rigid bodies
  • Wide field of view, with 70-degree lens coverage and 86-degree IR strobe coverage, respectively
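Rigid-body tracking of the kind described above reduces to recovering a rotation and translation from a handful of labeled marker positions in each frame. The sketch below illustrates the standard Kabsch algorithm for that step; the marker coordinates are invented for illustration and are not Vicon data.

```python
import numpy as np

def rigid_body_pose(model_markers, observed_markers):
    """Estimate the rotation R and translation t mapping model-frame marker
    positions to observed lab-frame positions (Kabsch algorithm)."""
    model_c = model_markers.mean(axis=0)
    obs_c = observed_markers.mean(axis=0)
    # Cross-covariance of the centered marker sets
    H = (model_markers - model_c).T @ (observed_markers - obs_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = obs_c - R @ model_c
    return R, t

# Illustrative marker set (mm): three markers fixed to one rigid body
model = np.array([[0.0, 0.0, 0.0], [50.0, 0.0, 0.0], [0.0, 30.0, 0.0]])
# Simulated observation: body rotated 90 degrees about z and translated
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
observed = model @ Rz.T + np.array([100.0, 200.0, 0.0])
R, t = rigid_body_pose(model, observed)
```

With noise-free markers the recovered pose matches the simulated one exactly; with real camera data the same least-squares fit simply averages out per-marker measurement noise.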

Baxter Robot

Baxter is a collaborative robot designed and manufactured by Rethink Robotics. Baxter's head has 12 sonar transducers distributed along its periphery that act as proximity sensors, and the head display acts as a face, showing different facial expressions. Baxter has two arms, each with 7 degrees of freedom; each joint has a series elastic actuator with position and force sensors. These sensors can be used for collision detection while working in proximity to human operators. The whole robot is mounted on four wheels for easy transportation.

The robot has a built-in Zero Force Gravity Compensation mode, which allows a user to teach end-effector positions without explicit programming; thus, the robot can be controlled in a teaching mode. Each arm has a maximum payload capacity of 5 lb (2.3 kg). The onboard computation consists of a 3rd Gen Intel Core i7-3770 processor (8 MB cache, 3.4 GHz) with HD 4000 graphics, 4 GB of 1600 MHz DDR3 RAM, and a 128 GB solid-state drive. The software development kit (SDK) interfaces with the hardware through ROS (Robot Operating System). The connection from the workstation to the robot is made over Ethernet, and the robot has an emergency shutdown switch. More details about the hardware and software can be found in the adjacent links.
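A series elastic actuator places a spring between the motor and the joint, so joint torque can be read as spring deflection and compared against the torque the arm's model predicts. The sketch below illustrates that idea only; the spring constant, angles, and threshold are hypothetical values, not Baxter's actual parameters.

```python
def sea_torque(spring_k_nm_per_rad, motor_angle_rad, joint_angle_rad):
    """Series elastic actuator: joint torque is inferred from the
    deflection of the spring between motor and joint."""
    return spring_k_nm_per_rad * (motor_angle_rad - joint_angle_rad)

def collision_detected(measured_torque_nm, expected_torque_nm,
                       threshold_nm=5.0):
    """Flag a collision when measured torque deviates from the model's
    expected torque by more than a threshold (hypothetical value)."""
    return abs(measured_torque_nm - expected_torque_nm) > threshold_nm

# Hypothetical reading: k = 100 N*m/rad, 0.08 rad deflection
tau = sea_torque(100.0, 0.58, 0.50)
# Free-space motion should only need ~1 N*m here, so this trips the check
hit = collision_detected(tau, expected_torque_nm=1.0)
```

On the real robot, this comparison runs per joint at the control rate, letting the arm stop or yield when it contacts a person or obstacle.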

UR3 Robot

The UR3 is a collaborative tabletop robotic arm built by Universal Robots. The robot has 6 degrees of freedom, with all joints being revolute. Each joint has a 360-degree working range and a maximum speed of 180°/s, and the end wrist has infinite rotation. The payload capacity is 3 kg (6.6 lb), with a maximum reach of 500 mm (19.7 in). The UR3 is equipped with a safety stop that triggers when the robot meets a force of 150 N; this threshold can be configured as low as 50 N.

The robot is also equipped with a Robotiq 2-Finger Adaptive Robot Gripper, which can provide feedback on part detection and object size. The robot's control box includes a teach pendant running the PolyScope GUI, which is used to control the robot. There are two main ways to program the robot: through the PolyScope GUI on the touch-screen teach pendant, or with the URScript language. There is also a ROS interface usable from Python or C++. The robot communicates over Ethernet and has a Real-Time Data Exchange (RTDE) module operating at 125 Hz. More details about the hardware and software are given in the following links.
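The safety-stop behavior above can be pictured as a threshold check running once per data-exchange cycle. The sketch below is schematic only: the force samples are simulated, not read from the real RTDE interface, and the function name is invented; only the 125 Hz rate and the 50-150 N threshold range come from the description above.

```python
# Schematic protective-stop monitor at the RTDE update rate
# (125 Hz, i.e. an 8 ms cycle). Forces here are simulated values.
CYCLE_S = 1.0 / 125.0  # data-exchange period in seconds

def monitor(force_samples_n, stop_threshold_n=150.0):
    """Return the index of the first cycle whose contact force reaches
    the threshold, or None if no protective stop is needed."""
    assert 50.0 <= stop_threshold_n <= 150.0, "UR3 range is 50-150 N"
    for i, force_n in enumerate(force_samples_n):
        if force_n >= stop_threshold_n:
            return i  # trigger protective stop at this cycle
    return None

# Simulated contact event: force ramps up past the default threshold
samples = [5.0, 12.0, 40.0, 90.0, 160.0, 200.0]
stop_at = monitor(samples)                  # default 150 N threshold
stop_at_sensitive = monitor(samples, 50.0)  # most sensitive setting
```

Lowering the threshold toward 50 N makes the robot stop earlier in the same contact event, which is the trade-off an integrator tunes when humans share the workspace.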