
Ambient Intelligence Lab

The AmI laboratory is located in room EF 210 of the Faculty of Automatic Control and Computer Science. It covers approximately 35 sq.m. and is equipped with:

  • 9 Microsoft Kinect sensors (the 3D depth-sensing device usually shipped with the Xbox), used for RGB imagery at 1280 x 1024 resolution and depth imagery at 640 x 480 resolution
  • one Samsung SNP-3120V camera with PTZ (Pan-Tilt-Zoom) control and a 360-degree field of view
  • 20 Arduino Mega boards, each containing:

      ◦ Atmel ATmega 2560 8-bit processor running at 16 MHz
      ◦ WiFly shield that enables wireless communication (802.11 b/g)
      ◦ Sharp infrared proximity sensor (sensing range: 20-150 cm)
      ◦ electret microphone
      ◦ humidity and temperature sensor
      ◦ ambient light sensor

For computing equipment, we use 12 servers, grouped as follows:

  • 7 servers for data crunching (ranging from quad-core Xeon with 16GB RAM to Core2Duo with 4GB RAM)
  • 5 servers for data acquisition (for pulling data in from the sensors)
  • a Zyxel GS 1100-24 Gigabit switch connecting all these servers

Most of the equipment is grouped into 9 T-shaped keypoints, numbered K1, K2, ..., K9, each containing one Kinect and two Arduino boards.

The actual physical layout of a keypoint is the following (k = Kinect, a = Arduino):

Throughout the room, the 9 keypoints are arranged according to the following schematic:

An actual picture from the laboratory showcasing a keypoint is the following:

 The main foci of the ongoing projects are:

  • distributed multi-modal tracking of a single elderly person (via RGB image analysis and sound source detection)
  • activity detection
  • companion robots
  • distributed exploration of the environment
Below you can watch a movie clip from the lab (click on the image).
AmI Lab