A Touch of Light

A step towards enlightenment for those living in the world of darkness.

A Young India Fellowship Initiative in collaboration with University of Pennsylvania


      





Monday, October 1, 2012

Team viSparsh receives a cash grant of 2 lac INR

We are happy to announce that we have received a grant of Rs. 2 lakhs as one of the winners of the Economic Times- Power of Ideas 2012 contest. The contest was held in association with the Department of Science and Technology and the Centre for Innovation Incubation and Entrepreneurship, IIM Ahmedabad. Thank you all for your support as our idea comes a step closer to realization.

 

Monday, September 3, 2012

Our Business Idea reaches the finals of Economic Times-Power of Ideas 2012

As the next step towards productizing viSparsh and making it available to its target users at a subsidized price, Team viSparsh prepared a Business Plan and applied to the Economic Times- Power of Ideas Contest 2012. The contest, organized in association with the Department of Science & Technology (DST) and the Indian Institute of Management (IIM), Ahmedabad, is India's largest entrepreneurship development programme.
We are excited that our business idea is among the top 75 ideas from across India, each of which stands a chance to win seed funding. The semi-finals were held in Bangalore last month, and the 75 ideas were shortlisted from among 504 semi-finalists and thousands of initial applicants. The team has also been offered an intensive 10-day mentorship workshop at IIM Ahmedabad to further refine and implement our business idea.
Thank you to everyone around the world who has been continuously supporting us in starting this next revolution in assistive technologies.

The ET- Power of Ideas 2012 finalist list can be found at:

Thursday, August 16, 2012

viSparsh chosen among the top 12 finalists of 'The Wall Street Journal Asian Innovation Awards 2012'

From among 240+ entries, The Wall Street Journal today announced the top 12 finalists of the Asian Innovation Awards 2012, and we are thrilled that project viSparsh is among them. Further, three of the top 12 entries this year are from India, compared to none in 2011. This suggests India is making real headway in innovation, and it makes us proud to be 'Young India Fellows' :)


Link: http://online.wsj.com/article/SB10000872396390444246904577574422852969682.html?KEYWORDS=asian+innovation+awards

Sunday, August 5, 2012

Team viSparsh meets Tony Scott (CIO, Microsoft)

It was a pleasant Thursday morning in Hyderabad, India, where Tony Scott (Chief Information Officer, Microsoft) visited on 2nd August 2012. Team viSparsh was excited to be given the opportunity to demo our project to him, along with Mr. Raj Biyani, MD of MSIT India.
The demo went off well, and Tony appreciated the rigorous effort we have put into assistive technology.
The idea of the project also resonated with Microsoft's aim of helping people and businesses throughout the world realize their full potential.
Tony also offered his support in helping us achieve our vision of bringing this technology to the people who need it.
Thank you Tony and Raj for your encouragement and support.

The way forward

viSparsh has huge potential to make the world a better place, with vast future scope and room for many more features. From day one, we have therefore been thinking of ways to bring this project to light by delivering on aspects like usability and design.
We recently met our professor at Penn and agreed that now is the time to give our prototype a commercially viable shape.
So, if you are a technology design company that can help us shrink the form factor, improve power efficiency and address other usability aspects, we would love to hear from you. Kindly fill in the contact form in the right sidebar to get in touch.

Monday, July 9, 2012

Re-setup of the Lab in Hyderabad

The first batch of the YIF Programme came to a formal close on 19th July 2012. Having received job offers from Microsoft, the entire viSparsh team has now shifted to Hyderabad, India. Although viSparsh began as part of the 8-month-long internship programme at YIF, the team now plans to take it forward in the City of Nawabs. Due to the relocation, the lab is still being moved, and we hope to soon resume our work of bringing this technology to the people who need it.

We have also received tremendous support from Dr. Pramath Raj Sinha (Founding Dean, ISB), who was in town for the YIFP Hyderabad Chapter meet.
       YIFP Hyderabad Chapter Meet in June 2012

Team viSparsh visits Accenture Innovation Labs in France

As part of the prize for winning the Accenture Innovation Jockeys Contest (powered by Yahoo!), Team viSparsh visited Accenture Technology Labs in Sophia Antipolis, France from 1st to 3rd July 2012. There, the team learned about various next-generation technologies that could be integrated into the project to make it more robust and functional. Thank you, Accenture & Yahoo!
Fig. : Team viSparsh in France

 Fig.:  Tushar (team member, viSparsh) explaining project 'viSparsh'


Team viSparsh gets Best ELM Award at YIFP Convocation


Team viSparsh won the Best Experiential Learning Module award at the first Young India Fellowship Convocation, held at the JNU Convention Hall, New Delhi on 19th May 2012. We are delighted by this recognition and further motivated to work harder and bring technologies like these to the people who need them.

Awesome day at McKinsey Office

As part of the coaching process at the Young India Fellowship Programme, our coach, Mr. Abhimanyu Puri, invited us to his office in Gurgaon on 10th May 2012. There we got a pleasant surprise: an opportunity to meet several senior leaders at McKinsey, who gave us valuable feedback and guidance on both personal and professional life. Thanks Abhi :)

Thursday, April 26, 2012

viSparsh wins the title of 'Accenture Innovation Jockeys' (Powered by Yahoo!)



After a series of levels, Team viSparsh has finally won the title of 'Innovation Jockeys 2012' (co-sponsored by Accenture and Yahoo!). The contest started in January 2012 as a search for the most innovative minds across campuses in India; viSparsh won from among 1082 entries from 500+ colleges all across India. The finale was held at the Leela Palace, Bangalore on 24th April 2012. As winners, the team has earned a visit to Accenture Innovation Labs at Sophia Antipolis, France in the coming months. The team also won the 'City Category' award.
Further information can be found at the Accenture Innovation Jockeys site.
Team viSparsh winning the Grand Finale of the Accenture Innovation Jockeys Contest (powered by Yahoo!). The prize was presented by Ms. Rekha M. Menon, Executive Director - Geographic Services, India and ASEAN, Accenture


Team viSparsh winning the City Category in the same event.

As part of the Final Round, the team had an insightful interaction with the Jury Members from Yahoo R&D, HP Labs, Accenture Global Team and several others.







News: http://varindia123a.blogspot.in/2012/04/innovation-jockeys-of-2012-awarded.html ,
http://mail.varindia.com/mobile/Accenture_InnovationJockeys2012.htm ,
http://news444.com/a/1067179/innovation_jockeys_of_2012_awarded

viSparsh Day @YIFP

We celebrated 'viSparsh Day' at our programme campus on 18th April 2012. The occasion marked our gratitude to the Young India Fellowship Programme and to our co-fellows, who have helped us throughout the viSparsh journey. We shared our success by hosting some games and organizing a small party on campus. We wholeheartedly thank all of you who have been a constant source of encouragement for team viSparsh. We are highly motivated to keep up the progress and deliver a viable product in the future. Thank You All :)

Sunday, March 25, 2012

viSparsh Intro

viSparsh Experience at TechEd2012

viSparsh Prototype 2




We have finally come up with the 2nd prototype of the belt, shown in the picture. Although the belt still looks similar in size to the previous version, we have made some major technical upgrades: the belt now has a more robust power-management system with a single ON-OFF button and fewer components. The previous post lists the changes in this new prototype.
Now that it is functionally robust, our next aim is to decrease the size of the belt.



In the next version, we are aiming for a sleeker design by considering possibilities like replacing the bigger PandaBoard with the smaller BeagleBone, and removing the Kinect's plastic casing to use only the required components.
One of our main focuses at present is a sleek design before adding more functionality.




viSparsh @ TechEd2012


All ready with the second prototype of the belt, we showcased viSparsh at TechEd2012, held in Bengaluru from 21st to 23rd March 2012.
TechEd is Microsoft's annual event showcasing state-of-the-art technologies. We were given a booth to demonstrate our prototype.
This second prototype of the belt has the following upgrades:
- Less bulky: the twelve 1.2V batteries are replaced with a single 3-cell LiPo battery.
- Single-switch ON-OFF operation.
- One major component (mbed) removed, by driving the vibration motors directly from the PandaBoard.
- Voltage cut-off circuit integrated to prevent battery damage.
- Better haptic feedback using the new vibration motors.

It was a fantastic experience demonstrating our project on such a big platform. The project was highly appreciated by the industry experts and other visitors to the event.


Better Haptic Feedback using new vibration motors

In the first prototype of the belt, we used ROB-08449 buzzers (left side of the image) to give haptic feedback to visually impaired users.
The vibrations they produced were perpendicular to the axis of the belt and were significantly damped once we taped them inside the belt, so we needed more powerful vibration motors.
We have replaced them with Pico Vibe 9mm vibration motors, 25mm type (right side of the image), which vibrate in both the X and Y planes.
As can be seen in the picture, they are slightly bigger (25mm length and 8.8mm diameter) than the 3.4mm-long ROB-08449 motors. However, both operate at 3V, and the Pico Vibe runs at 13,500rpm compared to the 12,000rpm of the ROB-08449 buzzers. This is a significant improvement, as our output depends on the strength of these vibrations. Further, instead of the earlier 6 motors, we now need only 3 motors in the belt (one for each zone: left, centre and right).
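The three-zone scheme above can be sketched in code. This is an illustrative sketch, not the team's actual implementation: the equal three-way zone split, the 1.5 m trigger distance and the function name are all assumptions.

```python
# Hypothetical sketch: split one row of Kinect depth readings (in mm) into
# three zones and decide which of the three motors should vibrate.
# The 1500 mm threshold and equal zone widths are illustrative assumptions.

def motors_to_fire(depth_row, threshold_mm=1500):
    """Return the set of zones ('left', 'centre', 'right') containing an
    obstacle closer than the threshold. A reading of 0 means 'no data'
    and is ignored."""
    n = len(depth_row)
    zones = {
        'left': depth_row[: n // 3],
        'centre': depth_row[n // 3 : 2 * n // 3],
        'right': depth_row[2 * n // 3 :],
    }
    return {
        name for name, readings in zones.items()
        if any(0 < d < threshold_mm for d in readings)
    }
```

For example, a frame that is clear on the sides but has a close object in the middle would fire only the centre motor.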

mbed Removed

We have finally succeeded in removing one major component from the belt. The initial design included an mbed, which we used to generate the PWM (Pulse Width Modulation) signals sent to the actuators (vibration motors in our case). However, we eventually figured out that the PandaBoard can provide the same functionality without the extra component cost of the mbed.

The PandaBoard has on-board GPIO (General Purpose Input Output) pins that can be put to multiple uses. We configured three GPIO pins as outputs and created three software threads to generate variable-duty PWM, which actuates the vibration motors that guide the visually impaired user around obstacles.
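The thread-per-motor PWM scheme can be sketched roughly as follows. This is a hypothetical illustration, not the project's code: set_pin() stands in for whatever GPIO write the PandaBoard code actually used (e.g. via sysfs), and the 100 Hz frequency is an assumption.

```python
import threading
import time

def pwm_times(duty, freq_hz=100):
    """Return (t_on, t_off) in seconds for one PWM period at the given
    duty cycle (0.0-1.0)."""
    period = 1.0 / freq_hz
    return duty * period, (1.0 - duty) * period

def pwm_thread(set_pin, get_duty, stop, freq_hz=100):
    """One thread per motor: toggle the pin with a duty cycle that can be
    changed on the fly (e.g. a stronger vibration for nearer obstacles).
    set_pin(1)/set_pin(0) is a stand-in for the real GPIO write."""
    while not stop.is_set():
        t_on, t_off = pwm_times(get_duty(), freq_hz)
        if t_on > 0:
            set_pin(1)
            time.sleep(t_on)
        set_pin(0)
        if t_off > 0:
            time.sleep(t_off)
```

Varying the duty cycle per zone lets the same three output pins convey obstacle proximity without any dedicated PWM hardware, which is what makes dropping the mbed possible.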

Removing the mbed is a major breakthrough for us, as we are continuously working on decreasing the size of the belt towards a sleek wearable design.

Belt with 'Single Switch ON-OFF' feature


In line with the continuous upgrades being made to the belt, the next step is making the belt start at the press of a single button. The circuit uses a DPDT (Double Pole, Double Throw) switch along with the power cut-off circuit.
This removes the burden of making power connections every time one wears the belt.


Belt upgraded with new Power Circuit


Power issues had been bothering us for quite some time. We were earlier using 4 x 1.2V batteries to power the PandaBoard and mbed, and 8 x 1.2V for the Kinect, which made the design quite bulky and took up unnecessary space. We therefore replaced these cells with a 2000mAh 3-cell LiPo battery, which gives 11.1V. The picture above shows the LiPo battery integrated into the belt.
Further, we have implemented a power cut-off circuit to protect the 3-cell LiPo battery from damage whenever its voltage drops below 9V.

We built the voltage cut-off circuit using details given on an RC forum; the picture shows the implemented circuit.
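The cut-off rule can be expressed as a small decision function. This is an illustrative sketch, not the actual circuit logic: the 9 V threshold comes from the post, while the hysteresis value and function names are assumptions.

```python
# Sketch of the cut-off rule described above: disconnect the load once the
# 3-cell LiPo pack sags below 9 V (roughly 3.0 V per cell) to avoid
# deep-discharging it. The hysteresis value is an illustrative assumption.

CUTOFF_V = 9.0

def should_cut_off(pack_voltage, currently_on=True, hysteresis=0.3):
    """Return True if the load should be disconnected. Once off, require
    the pack to recover above CUTOFF_V + hysteresis before re-enabling,
    so the circuit does not chatter around the threshold."""
    if currently_on:
        return pack_voltage < CUTOFF_V
    return pack_voltage < CUTOFF_V + hysteresis
```

The hysteresis mirrors what a comparator-based hardware cut-off typically provides: without it, a pack hovering near 9 V would rapidly connect and disconnect the load.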

Monday, January 16, 2012

viSparsh: A touch of light

We are continuously fine-tuning the belt according to feedback from various organizations for the blind. We have been visiting the National Association for the Blind (NAB), Delhi, where we obtained valuable feedback from Mohd. Wasim.


Mohd. Wasim has been blind since birth. He really appreciated the concept of a haptic belt that leaves his hands and ears free and gives a broader perception of obstacles in his path. Here is a brief video describing the concept and working of viSparsh-

Wednesday, January 4, 2012

Test Drive-2: Rigorous Testing

This was the second test drive of the viSparsh belt. We prepared an obstacle course in a closed hall and simulated a special test environment with dummies and different sorts of obstacles, of varying materials, heights and widths. The test results were quite encouraging.


There have been some significant changes since our last test drive-
1. We have increased the number of vibrators from three to six, to give the user a better indication of obstacles.
2. An autorun script has been created and deployed, which starts the viSparsh belt automatically when it is switched on.


Here is a video snippet of this test drive-


Tuesday, December 6, 2011

Autorun

Although we finished the first phase of viSparsh, certain fine-tunings were required to make the belt wearable. The belt still needed to be started by executing a file in an Ubuntu terminal. To resolve that, we needed an autorun file that runs automatically at startup and kick-starts the belt. This can be done by creating an initialization daemon. We have created this daemon, and the belt now starts on its own. The detailed process is given below-


STEP 1: Create the autorun script
Create a new file 'auto_visparsh' on the Desktop containing the commands that start the viSparsh belt. Copy the following commands into this file.

COMMAND:
cd /root/OpenNI/Platform/Linux-x86/Bin/Release
./SimpleRead.net.exe


STEP 2: Copy this autorun file to /etc/init.d
/etc/init.d contains the scripts that are run automatically on startup and shutdown. These scripts typically launch daemons, i.e. processes that execute in the background.

COMMAND:
cd /etc/init.d
cp /home/visparsh/Desktop/auto_visparsh /etc/init.d


STEP 3: Choose the run-level
Create a soft link to the autorun file in one of the run-level directories {rc0.d, rc1.d, rc2.d, rc3.d, rc4.d, rc5.d, rc6.d}. For our purpose, choose 'rc2.d'. Don't use 'rc0.d', 'rc1.d' or 'rc6.d', as they contain shutdown scripts. We can also set the priority of the process using 'SXX', where XX is the priority {00<=XX<=99}. The letter 'S' stands for 'Start' and can be replaced by 'K' (kill) to deactivate a script.

COMMAND:
cd /etc/rc2.d
ln -s ../init.d/auto_visparsh S99auto_visparsh

STEP 4: Restart
Now restart the board. This time the belt will start automatically.

Tuesday, November 29, 2011

Test Drive-1: GET, SET……… GO!!!!!!!!!!


After one and a half months of hard work, we have finally finished the first phase of our project: viSparsh v1.0. The belt includes a plastic box containing the PandaBoard, mbed and voltage converter, with the Kinect fixed at the front of the belt. Currently we are using three vibration motors, one per zone, which we plan to increase to six in version 1.1.

We had our test drive today in our hostel, and it worked exceptionally well with no false alarms. Some of the other fellows helped us shoot the video and acted as obstacles. Here is the video of this test drive-






Power Supply Issues

After powering the circuit from AC mains, it was time to switch to batteries, as the viSparsh belt would be mobile. So we purchased a Chinese 12V 2100mAh battery pack to run the Kinect, and used a 7805 voltage regulator to supply 5V to the mbed and PandaBoard. The batteries drained in a couple of minutes, and the situation got worse after recharging; this battery proved useless for us.
We then used 8 AA-size Kodak 2100mAh rechargeable cells. These powered our setup for 15-odd minutes, still short of what we required, so we got 8 more to power the Kinect separately. Even so, the power dissipation in the 7805 was huge, and we decided to eliminate it. As one cell is rated 1.2V (though it measures between 1.3-1.4V), we connected 4 cells in series to draw slightly more than 5V.
Here is the final number of cells we have used in the project:

1. Mbed:                  4 AA Size 1.2V 1000mAh          Output: Slightly more than 5V
2. PandaBoard:        4 AA Size 1.2V 2100mAh          Output: Slightly more than 5V
3. Kinect                  8 AA Size 1.2V 2500mAh          Output: Slightly more than 10V

Here, we were able to eliminate the 7805 voltage regulator from our circuit and hence minimize the power dissipation. We tested the belt for around half an hour, and it worked perfectly with this configuration.
We may add 4 more cells in parallel to power PandaBoard.

Another option we may consider is to order a customized battery with our specifications.
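As a back-of-the-envelope check on configurations like the ones above, runtime is roughly capacity divided by load current, and parallel strings add capacity while series cells only add voltage. The current-draw figure in the example below is an assumption for illustration, not a measurement from the project.

```python
# Rough runtime estimate for a battery pack. Cells in series share one
# capacity; parallel strings multiply it. Real runtime is lower due to
# losses and the low-voltage cut-off, so treat this as an upper bound.

def runtime_minutes(capacity_mah, draw_ma, parallel_strings=1):
    """Estimated runtime in minutes: total capacity divided by load current."""
    return 60.0 * capacity_mah * parallel_strings / draw_ma
```

For instance, with an assumed 700 mA draw, a 2100 mAh string gives about 180 minutes, and adding a second string in parallel (as considered above for the PandaBoard) doubles that estimate.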

Booting Ubuntu on Pandaboard EA3

STEP 1: Download the Ubuntu image
Use this link to download the pre-installed image of Ubuntu 10.10.

STEP 2: Prepare the SD card
Insert the SD card into your PC and unmount it. Then use this command to copy the pre-built image onto the card (replace /dev/sdb if your card appears as a different device).

COMMAND:
sudo sh -c 'zcat ./ubuntu-netbook-10.10-preinstalled-netbook-armel+omap4.img.gz|dd bs=4M of=/dev/sdb; sync'

STEP 3: Download and copy the required u-boot and MLO files
For PandaBoard version A2 and later, follow these instructions. In our case we had an EA3 version, so this applied to us.

1.      Download the panda.tar.bz2 archive containing the u-boot and MLO files
2.      Untar it with "tar -jxf panda.tar.bz2"
3.      Mount the first partition of the imaged SD card
4.      Copy MLO and u-boot.bin (extracted from the tar file) to the mounted partition.

STEP 4: Boot the PandaBoard
Now unmount the SD card, insert it into the PandaBoard and power it up. It should start booting.

STEP 5: Now use the same procedure to install OpenNI and the Kinect drivers that we used for the BeagleBoard.



Saturday, November 26, 2011

Getting ready to release the 1st version




Huhuuu! After facing lots of trouble at every step, we finally got the project working by seamlessly integrating all the components. The video above shows how the Kinect detects an object (a human in our case) and raises an alarm depending on the distance between the person and the Kinect. We have used three vibration motors to produce these alert signals, each sending an alert for an obstacle in one particular direction: left, middle or right. These three motors help a visually impaired person identify the direction in which an obstacle is present. We hope to transfer this functionality onto the belt in a day or two. This will mark the end of the first phase of our project, in which we were to build a working prototype of the viSparsh belt.

Wednesday, November 23, 2011

Solving the Response Time Problem


The response time of the data transmitted by the Kinect to the BeagleBoard came to around 4-5 seconds, quite contrary to our expectations. By 'response time' we mean the time lag between two samples of data transmitted by the Kinect to the BeagleBoard.

To solve this problem, we tried running the same code on the PandaBoard and, hurray, the problem was solved: the response time dropped to a few milliseconds. Although both boards have a 1GHz processor, the BeagleBoard has 512MB of DDR RAM whereas the PandaBoard has 1GB of DDR2 RAM.

Active USB Hub: The way out for our Kinect Detection Problem



To our utter surprise, the Kinect stopped working all of a sudden: when running the Kinect sample programs, the device was not detected, as mentioned in one of our previous posts. A few days ago we finally found the solution. After ruling out all the software-error possibilities, we were left with trying an active (powered) USB hub, and it actually worked. The problem is that although the Kinect runs on a 12V adapter, it still draws a substantial current from the USB port it is connected to, more than the BeagleBoard/PandaBoard could supply. By connecting the Kinect through an active USB hub instead of directly to the board, we removed the detection problem.

Saturday, November 12, 2011

Time for Efficiency Optimization


The YIF journey has been a life-changer for each of the Young India Fellows. For team viSparsh especially, managing a technical project alongside an intensive liberal arts course has been tough. We have been working days and nights to quickly complete the 1st phase of our ELM (building the viSparsh belt prototype), but the demanding coursework and other activities at YIF also consumed a fair part of our time. We finally acknowledged the urgent need for effective time management, which we had not strictly followed earlier. On the guidance of our mentor, Prof. Rahul Mangharam, we started by chalking out a plan for the coming two weeks. Given below is the work allocation for the next two weeks:
– Resolving issues with Kinect on BeagleBoard (Tushar - 16/11/11)
– Completing all installation work (including OS and Kinect) on PandaBoard (Jatin - 17/11/11)
– Talking to organizations like Kritical Solutions and Saksham NGO (suggested by YIFP admin) to get inputs from visually impaired people, learn about their issues and talk to others already working in this field (Rolly - 30/11/11)
– Making the PCB for the project (Jatin - 22/11/11)
– Integrating all components (Tushar - 26/11/11)
– Making the initial prototype with the belt (Tushar - 26/11/11)
– Updating the blog, contacting people and other communications (Rolly - on a regular basis)
Alongside this initial work distribution, we will soon make a proper timeline for the rest of the project, to help us keep better track of it and meet our deadlines.

Problem Bubbles Popping Out


It seems we are on a board of 'Snakes & Ladders'. Just when we were about to complete Phase I, another technical issue hit us: the Kinect, which was working well so far (as posted earlier), is no longer detected by the BeagleBoard.
We get the following error when running a sample Kinect program-
Error: One or more of the following nodes could not be enumerated.
Device: PrimeSense/SensorKinect/5.0.3.4: The device is not connected!
For the last four days we have been struggling with this problem. We have tried various fixes available on the web, but none has worked so far. We even prepared a new SD card with everything installed from scratch, but the same error persists. We are trying our best to find whatever we have been unknowingly missing. This has also pushed back our submission deadlines, especially given the term papers and exams in the pipeline.

Monday, November 7, 2011

Battle with Bugs


The entire last week was a battle with bugs and installation problems. The OpenNI website had updated the OpenNI files without updating the Sensor files, which caused errors during installation and forced us to read almost every blog and website on the web to debug them. We left no stone unturned and finally managed to install the sensor correctly. Here is a detailed description of the installation instructions and debugging; we are trying to put all the problems and their solutions in one place to save others time.

STEP 1: Install required packages for Kinect
We need to install certain packages in Ubuntu; these will facilitate the installation of OpenNI and the Kinect drivers.

COMMAND:
sudo apt-get install git-core cmake libglut3-dev pkg-config 
sudo apt-get install gcc g++ build-essential libxmu-dev 
sudo apt-get install libxi-dev libusb-1.0-0-dev 
sudo apt-get install doxygen graphviz git

ERROR:
libglut3-dev cannot be installed

FIX:
sudo apt-get install git-core cmake freeglut3-dev pkg-config
sudo apt-get install gcc g++ build-essential libxmu-dev 
sudo apt-get install libxi-dev libusb-1.0-0-dev 
sudo apt-get install doxygen graphviz git
In the latest versions, ‘libglut3-dev’ has been replaced by ‘freeglut3-dev’, so install it in place of ‘libglut3-dev’. In case it does not work, install libglut3 from launchpad.net/ubuntu/natty/+package/libglut3-dev.

Although some blogs say that ‘doxygen’ and ‘graphviz’ are optional, we found that they are required, so don’t skip them during installation.

STEP 2: Create a new directory for Kinect

COMMAND:
mkdir ~/kinect
cd ~/kinect

STEP 3: Download OpenNI from the git repository

COMMAND:
git clone https://github.com/OpenNI/OpenNI.git


STEP 4: Install OpenNI

COMMAND:
cd OpenNI/Platform/Linux-x86/Build
make && sudo make install

ERROR1:
Cannot find the metadata file "system.windows.forms.dll"

FIX1:
sudo apt-get install mono-complete
Although it is odd that Linux complains about a Windows DLL file, the solution is to install ‘mono’. Mono is a platform for running and developing applications based on the ECMA/ISO standards.

ERROR2:
No access permission for install.sh and RedistMaker.

FIX2:
The error indicates that the install.sh and RedistMaker files do not have execute permission, so grant it:
cd ../CreateRedist
sudo chmod +x install.sh RedistMaker
cd ../Build

ERROR3:
CommonMakefile does not exist

FIX3:
sudo apt-get install mono-complete

ERROR 4:
“arm-angstrom-linux-gnueabi” does not exist.
The error is generated because we are on the ARM platform, and the Platform.Arm file tries to access the gnueabi file for Angstrom (arm-angstrom-linux-gnueabi), which does not exist in Ubuntu.

FIX 4:
cd Common 
mv Platform.Arm Platform.Arm.BAK
cp Platform.x86 Platform.Arm

ERROR 5:
Unrecognized command line option "-malign-double"
Unrecognized command line option "-msse2"

FIX 5: Edit the Platform.Arm file in ~/kinect/OpenNI/Platform/Linux-x86/Build/Common

In the latest version of OpenNI, the ARM platform has been included under the name Linux-ARM. However, if you go into ~/kinect/OpenNI/Platform/Linux-ARM/Build/ and try to build it, you will face errors, so it is better to use the old Linux-x86 platform files with some modifications.

Open the ~/kinect/OpenNI/Platform/Linux-x86/Build/Common/Platform.Arm and ~/kinect/OpenNI/Platform/Linux-x86/Build/Common/Platform.x86 files and comment out these lines:

  CFLAGS += -malign-double
  and
  ifeq ($(SSE_GENERATION), 2)
        CFLAGS += -msse2
  else
        ifeq ($(SSE_GENERATION), 3)
               CFLAGS += -msse3
        else
               $(error "Only SSE2 and SSE3 are supported")
        endif
  endif

We need to do this because the ‘-malign-double’ and ‘-msse2’/‘-msse3’ flags are valid only on the x86 platform and will not work with gcc on ARM. Commenting out these lines makes the code build for ARM.


STEP 5: Download Kinect driver

Most blogs mention https://github.com/boilerbots/Sensor.git for the Kinect sensor’s files, but those files are not updated and generated many errors and bugs. Use https://github.com/avin2/SensorKinect, which is the updated version and does not produce these errors.

COMMAND: 
cd ~/kinect/
git clone https://github.com/avin2/SensorKinect
cd SensorKinect


STEP 6: Install Kinect driver
cd Platform/Linux-x86/Build
make && sudo make install

ERROR1:
CommonMakefile does not exist

FIX1:
sudo apt-get install mono-complete
Now try building again. If that does not work, open the Makefile in ~/kinect/SensorKinect/Platform/Linux-x86/Build, for example:
gedit ~/kinect/SensorKinect/Platform/Linux-x86/Build/Makefile
Replace ‘LIB_USED’ with ‘USED_LIBS’ and build again. You may find that the name is already ‘USED_LIBS’; in that case, follow the instructions given below.

The latest versions of OpenNI have renamed the common file to CommonCppMakefile, but most of the Sensor files have not yet been updated to match. Therefore, execute these commands to create a soft link to CommonCppMakefile under the name CommonMakefile.
cd /usr/include/ni
ln -s ./CommonCppMakefile ./CommonMakefile

If that does not work, open /usr/include/ni and check for these files-
CommonCppMakefile
CommonDefs.mak
CommonTargets.mak
CommonCSMakefile
CommonJavaMakefile
Platform.CE4100
Platform.x86
Platform.Arm
OpenNI probably has not created these files, so you need to copy them here. Execute these commands-
cd ~/kinect/OpenNI/Platform/Linux-x86/Build
cp Common/* /usr/include/ni

STEP 7: Now use your Kinect
Connect the Kinect to the BeagleBoard and run the sample programs-
cd ~/kinect/OpenNI/Platform/Linux-x86/Bin/Release
./SampleNiRead


ERROR :
InitFromXml failed: Failed to set USB Interface!

FIX :
sudo rmmod gspca_kinect


    

Bootstrap


Here are the instructions to download an image of Ubuntu Maverick and install it on an SD/MMC card. You may choose any Ubuntu flavour you wish; we have chosen Maverick 10.10 here. Download any other flavour of Ubuntu from here: http://rcn-ee.net/deb/rootfs/maverick/.

STEP 1: Download and extract Ubuntu Maverick 10.10
wget http://rcn-ee.net/deb/rootfs/maverick/ubuntu-10.10-r7-minimal-armel.tar.xz
 
tar xJf ubuntu-10.10-r7-minimal-armel.tar.xz
cd ubuntu-10.10-r7-minimal-armel

STEP 2: Look for the MMC/SD card
sudo ./setup_sdcard.sh --probe-mmc

This would result in something like this-
Are you sure? I Don't see [/dev/idontknow], here is what I do see...
 
fdisk -l:
Disk /dev/sda: 500.1 GB, 500107862016 bytes <-x86 Root Drive
Disk /dev/sdb: 7957 MB, 7957325824 bytes    <-MMC/SD card
 
mount:
/dev/sda1 on / type ext4 (rw,errors=remount-ro,commit=0) <-x86 Root Partition
Here you can see that /dev/sdb matches your MMC/SD card's size.

STEP 3: Partition the MMC/SD card and install the Ubuntu image
sudo ./setup_sdcard.sh --mmc /dev/sdb --uboot beagle

Version                            --uboot option
BeagleBoard Bx                     --uboot beagle_bx
BeagleBoard Cx, xM, A/B/C          --uboot beagle
PandaBoard                         --uboot panda

STEP 4: Insert the MMC/SD card into the SD card slot on the BeagleBoard, connect the monitor and boot it. Use these login details:
Username: ubuntu 
Password: temppwd

STEP 5: Connect to Internet and install GUI and other necessary packages
sudo dhclient eth0
sudo apt-get update
sudo apt-get install xubuntu-desktop
sudo apt-get install xfce4 gdm xubuntu-gdm-theme xubuntu-artwork 
sudo apt-get install xserver-xorg-video-omap3 network-manager
We also tried ‘sudo apt-get upgrade’ at this step; it took several hours and left us with a non-graphical interface, so there is no need to run it here.

STEP 6: Restart the BeagleBoard.


Monday, October 1, 2012

Team viSparsh receives a cash grant of 2 lac INR

We are happy to announce that we have received a grant of  Rs. 2 lakhs as one of the winners of the Economic Times- Power of Ideas 2012 contest.The contest was held in association with Department of Science and Technology and the Centre for Innovation Incubation and Entrepreneurship, IIM Ahmedabad.  Thank you all for the support as we see our idea comes a step closer to the realization.

 

Monday, September 3, 2012

Our Business Idea reaches the finals of Economic Times - Power of Ideas 2012

As a next step to productize viSparsh and make it reach the target audience at a subsidized price, Team viSparsh prepared a business plan and applied to the Economic Times - Power of Ideas Contest 2012. The contest, organized in association with the Department of Science & Technology (DST) and the Indian Institute of Management (IIM), Ahmedabad, is India's largest entrepreneurship development programme.
We are excited that our business idea is among the top 75 from across India, each of which stands a chance to win seed funding. The semi-finals were held in Bangalore last month, and the 75 ideas were shortlisted from 504 semi-finalists and thousands of initial applicants. The team has been offered an intensive 10-day mentorship workshop at IIM Ahmedabad to further refine and implement the business idea.
Thank you to everyone around the world who has been continuously supporting us in starting this next revolution in the field of assistive technologies.

The ET- Power of Ideas 2012 finalist list can be found at :

Thursday, August 16, 2012

viSparsh chosen among the top 12 finalists of 'The Wall Street Journal Asian Innovation Awards 2012'

From among 240+ entries, The Wall Street Journal today announced the top 12 finalists of the Asian Innovation Awards 2012. We are thrilled to share that project viSparsh has been chosen as one of those finalists. Further, this year three of the top 12 entries are from India, compared to none in 2011. This surely indicates that India is making headway in innovation, and it makes us proud to be 'Young India Fellows' :)


Link: http://online.wsj.com/article/SB10000872396390444246904577574422852969682.html?KEYWORDS=asian+innovation+awards

Sunday, August 5, 2012

Team viSparsh meets Tony Scott (CIO, Microsoft)

It was a pleasant Thursday morning for us in Hyderabad, India, which Tony Scott (Chief Information Officer, Microsoft) visited on 2nd August, 2012. Team viSparsh was excited to be given the opportunity to demo our project to him along with Mr. Raj Biyani, MD of MSIT India.
As planned, the demo went off well, and Tony greatly appreciated the rigorous effort we have put into the domain of assistive technology.
The idea of the project also resonated with Microsoft's aim of helping people and businesses throughout the world realize their full potential.
Tony also offered to extend his support in helping us achieve our vision of making this technology reach the people who need it.
Thank you Tony and Raj for your encouragement and support.

The way forward

viSparsh has huge potential to change the world and make it a better place, with vast future scope and room for many more features. Thus, right from day one of this project, we have been thinking of ways it can see the light of day by delivering on aspects like usability and design.
We recently met our professor at Penn and agreed that now is the time to give our prototype a viable shape.
So, if you are a tech design company that can help us shrink the form factor, improve power efficiency, and address other usability aspects, we would love to hear from you. Kindly fill in the contact form on the right sidebar to get in touch.

Monday, July 9, 2012

Re- set up of the Lab in Hyderabad

The first batch of the YIF Programme came to a formal close on 19th July, 2012. Having received job offers from Microsoft, the entire viSparsh team has now shifted to Hyderabad, India. Although viSparsh was only part of the 8-month internship programme at YIF, the team now plans to take it forward in the City of Nawabs. Due to the relocation, the lab is still being moved, and we hope to soon resume our work of making technology reach the people who need it.

We also have received tremendous support from Dr. Pramath Raj Sinha (Founding Dean, ISB) who was present in the town for the YIFP Hyderabad Chapter meet.
       YIFP Hyderabad Chapter Meet in June 2012

Team viSparsh visits Accenture Innovation Labs in France

As part of the winning prize of the Accenture Innovation Jockeys Contest (powered by Yahoo!), Team viSparsh got an opportunity to visit Accenture Technology Labs in Sophia Antipolis, France, from 1st to 3rd July, 2012. In France, the team learned about various next-generation technologies that can be integrated into the project to make it more robust and functional. Thank you Accenture & Yahoo!
Fig. : Team viSparsh in France

 Fig.:  Tushar (team member, viSparsh) explaining project 'viSparsh'


Team viSparsh gets Best ELM Award at YIFP Convocation


Team viSparsh won the Best Experiential Learning Module award at the first Young India Fellowship Convocation. The event was held at the JNU Convention Hall, New Delhi, on 19th May, 2012. The team is thrilled to receive this appreciation and is further motivated to work harder and make technologies like these reach the people who need them.

Awesome day at McKinsey Office

As part of our coaching process at the Young India Fellowship Programme, our coach, Mr. Abhimanyu Puri, invited us to his office in Gurgaon on 10th May 2012. There we got a pleasant surprise: an opportunity to meet various senior leaders at McKinsey. We received much valuable feedback and guidance from them on both personal and professional life. Thanks Abhi :)

Thursday, April 26, 2012

viSparsh wins the title of 'Accenture Innovation Jockeys' (Powered by Yahoo!)



After a series of rounds, Team viSparsh has finally won the title of 'Innovation Jockeys 2012' (co-sponsored by Accenture and Yahoo!). The contest started in January 2012 to search for the most innovative minds across campuses in India. viSparsh won from among 1082 entries from 500+ colleges all across India. The finale was held at the Leela Palace, Bangalore, on 24th April 2012. As winners, the team has won an opportunity to visit Accenture Innovation Labs at Sophia Antipolis, France, in the coming months. The team also won the 'City Category' award.
Further information can be found at the Accenture Innovation Jockeys site.
Team viSparsh winning the Grand Finale of the Accenture Innovation Jockeys Contest (powered by Yahoo!). The prize was presented by Ms. Rekha M. Menon, Executive Director - Geographic Services, India and ASEAN, Accenture.


Team viSparsh winning the City Category in the same event.

As part of the Final Round, the team had an insightful interaction with the Jury Members from Yahoo R&D, HP Labs, Accenture Global Team and several others.







News: http://varindia123a.blogspot.in/2012/04/innovation-jockeys-of-2012-awarded.html ,
http://mail.varindia.com/mobile/Accenture_InnovationJockeys2012.htm ,
http://news444.com/a/1067179/innovation_jockeys_of_2012_awarded

viSparsh Day @YIFP

We celebrated 'viSparsh Day' at our programme campus on 18th April 2012. The occasion marked our gratitude to the Young India Fellowship Programme and our co-fellows who have helped us throughout the viSparsh journey. We shared the success by hosting some games and organizing a small party on campus. We wholeheartedly thank all of you who have been a constant source of encouragement for team viSparsh. We are highly motivated to keep up the progress and eventually deliver a viable product. Thank You All :)

Sunday, March 25, 2012

viSparsh Intro

viSparsh Experience at TechEd2012

viSparsh Prototype 2




We have finally come up with the second prototype of the belt, shown in the picture. Although the size of the belt still looks similar to the previous version, we have made some major technical upgrades. The belt now has a more robust power management system with a single ON-OFF button and fewer components. The previous post lists the changes in this new prototype.
Having made it functionally robust, our next aim is to decrease the size of the belt.



In the next version, we are trying to make a sleeker design by considering various possibilities, like replacing the bigger PandaBoard with the small BeagleBone and removing the Kinect's plastic casing to use only the required components.
Our main focus at present is to build a sleek design before adding more functionality.




viSparsh @ TechEd2012


All ready with the second prototype of the belt, we showcased viSparsh at TechEd2012, held at Bengaluru from 21st to 23rd March 2012.
TechEd is Microsoft's annual event to showcase state-of-the-art technologies. We were given a booth to demonstrate our prototype.
This second prototype of the belt has the following upgrades:
- Less bulky, due to the replacement of twelve 1.2V batteries with a single 3-cell LiPo battery.
- Single-switch ON-OFF operation.
- One major component, the mbed, removed by driving the vibration motors directly from the PandaBoard.
- Voltage cut-off circuit integrated to protect the battery from damage.
- Better haptic feedback using the new vibration motors.

It was a fantastic experience demonstrating our project on such a big platform. The project was highly appreciated by industry experts and other visitors to the event.


Better Haptic Feedback using new vibration motors

In our first prototype of the belt, we were using ROB-08449 buzzers (left side of the image) to give haptic feedback to visually impaired users.
The vibrations they produced were perpendicular to the axis of the belt and thus got significantly damped once we taped them inside the belt. Therefore, we needed more powerful vibration motors.
We have replaced them with the Pico Vibe 9mm vibration motor, 25mm type (right side of the image), as it provides vibrations in both the X and Y planes.
As can be seen in the picture, they are slightly bigger (25mm length and 8.8mm diameter) compared to the 3.4mm length of the ROB-08449 motors. However, both operate at 3V, and the Pico Vibe spins at 13,500 rpm compared to 12,000 rpm for the ROB-08449 buzzers. This is a significant improvement, as our output depends on the efficiency of these vibrations. Further, instead of the earlier six motors, we now need only three in the belt (one for each zone: left, centre, and right).
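As an illustration of this three-zone scheme (a hedged sketch, not the actual viSparsh code), the snippet below splits a single row of depth readings into left, centre, and right thirds and maps the nearest obstacle in each third to a vibration intensity. The millimetre units and the 0.5 m / 2 m thresholds are assumptions for the example.

```python
# Hedged sketch: map one row of depth readings (millimetres) to three
# vibration intensities, one per belt zone. Thresholds are illustrative.
MIN_MM, MAX_MM = 500, 2000  # full vibration inside 0.5 m, none beyond 2 m

def zone_intensities(depth_row):
    """Return [left, centre, right] intensities, each in [0.0, 1.0]."""
    n = len(depth_row)
    thirds = (depth_row[:n // 3],
              depth_row[n // 3:2 * n // 3],
              depth_row[2 * n // 3:])
    intensities = []
    for zone in thirds:
        valid = [d for d in zone if d > 0]       # 0 means no depth reading
        nearest = min(valid) if valid else MAX_MM
        nearest = min(max(nearest, MIN_MM), MAX_MM)
        intensities.append((MAX_MM - nearest) / (MAX_MM - MIN_MM))
    return intensities
```

With these thresholds, an obstacle at 0.4 m on the left and nothing within 2 m elsewhere drives only the left motor, at full strength.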

mbed Removed

We have finally succeeded in removing one major component from the belt. The initial design included an mbed, which we used to generate PWM (Pulse Width Modulation) signals for the actuators (vibration motors in our case). However, we eventually figured out that the PandaBoard can provide the same functionality without the extra component cost of the mbed.

The PandaBoard has on-board GPIO (General Purpose Input Output) pins that can serve multiple functions. We configured three GPIO pins as outputs and created three software threads to generate variable-duty PWM. This PWM drives the vibration motors that guide the visually impaired user in obstacle detection.
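As a rough sketch of this approach (an illustration under assumptions, not the actual viSparsh code), the snippet below bit-bangs a variable-duty PWM signal on a sysfs GPIO pin from a thread; the pin number, the 20 ms period, and the sysfs path are hypothetical, and the pin is assumed to be already exported and set as an output.

```python
import threading
import time

PERIOD_S = 0.02  # 20 ms PWM period (illustrative value)

def pwm_times(duty, period=PERIOD_S):
    """Split one PWM period into (on, off) durations for a duty in [0, 1]."""
    duty = min(max(duty, 0.0), 1.0)
    on = duty * period
    return on, period - on

class SoftPwm(threading.Thread):
    """Software PWM on a sysfs GPIO pin, e.g. /sys/class/gpio/gpio139."""

    def __init__(self, gpio_num):
        super().__init__(daemon=True)
        self.duty = 0.0  # updated by the obstacle-detection logic
        self._value_path = "/sys/class/gpio/gpio%d/value" % gpio_num

    def _write(self, bit):
        with open(self._value_path, "w") as f:
            f.write("%d\n" % bit)

    def run(self):
        while True:
            on, off = pwm_times(self.duty)
            if on > 0:
                self._write(1)
                time.sleep(on)
            self._write(0)
            time.sleep(off)
```

Three such threads, one per motor zone, would each have their duty cycle updated from the depth data.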

Removal of mbed is a major breakthrough for us as we are continuously working on decreasing the size of the belt to make a sleek wearable design.

Belt with 'Single Switch ON-OFF' feature


In line with the continuous upgrades being made to the belt, the next step is making the belt start at the press of a single button. The circuit deploys a DPDT (Double Pole, Double Throw) switch along with the power cut-off circuit.
This takes away the burden of making the power connections every time one wears the belt.


Belt upgraded with new Power Circuit


Power issues have been bothering us for quite some time. We were earlier using 4 x 1.2V batteries to power the PandaBoard and mbed, and 8 x 1.2V for the Kinect. This made the design quite bulky and took up unnecessary space. Thus, we decided to reduce the size by replacing these cells with a 2000mAh, 3-cell LiPo battery, which gives 11.1V. The picture above shows the LiPo batteries integrated into the belt.
Further, we have implemented a power cut-off circuit to protect the 3-cell LiPo battery from damage whenever its voltage drops below 9V.

We built the voltage cut-off circuit using details given on an RC forum.
The picture shows the implemented circuit.

Monday, March 12, 2012

Monday, January 16, 2012

viSparsh: A touch of light

We are continuously fine-tuning the belt according to feedback obtained from various organizations for the blind. We have been visiting the National Association for the Blind (NAB), Delhi, and obtained valuable feedback from Mohd. Wasim.


Mohd. Wasim has been blind since birth. He really appreciated the concept of a haptic belt that leaves his hands and ears free and gives a broader perception of obstacles in the path. Here is a brief video describing the concept and working of viSparsh-

Wednesday, January 4, 2012

Test Drive-2: Rigorous Testing

This is the second test drive of the viSparsh belt. We prepared an obstacle course in a closed hall and simulated a special test environment with dummies and different sorts of obstacles. The obstacles were of different materials, heights, and widths. The test results were quite encouraging.


There have been some significant changes since our last test drive- 
1. We have increased the number of vibrators from three to six to give the user a better indication of obstacles. 
2. An autorun script has been created and deployed, which starts the viSparsh belt automatically when we switch it on.  


Here is a video snippet of this test drive-


Tuesday, December 6, 2011

Autorun

Although we finished the first phase of viSparsh, certain fine-tunings were required to make the belt wearable. The belt still needed to be started by executing a file in the Ubuntu terminal. To resolve that, we needed an autorun script that runs automatically at startup and kickstarts the belt. This can be done by creating an init script. We have created this script, and now the belt runs on its own. The detailed process is given below- 


STEP1: Create the autorun script
Create a new file 'auto_visparsh' on the Desktop containing the commands that start the viSparsh belt. Copy the following content into this file. 

COMMAND:
#!/bin/sh
cd /root/OpenNI/Platform/Linux-x86/Bin/Release
./SimpleRead.net.exe


STEP2: Copy this autorun file to /etc/init.d 
/etc/init.d contains the scripts that are run automatically on startup and shutdown. The '.d' suffix conventionally stands for 'directory' (a directory of scripts); many of these scripts start daemons, i.e. processes that run in the background.

COMMAND:
cd /etc/init.d
sudo cp /home/visparsh/Desktop/auto_visparsh /etc/init.d
sudo chmod +x /etc/init.d/auto_visparsh


STEP3: Choose the run-level
Create a 'soft link' to the autorun file in one of the run-level directories {rc0.d, rc1.d, rc2.d, rc3.d, rc4.d, rc5.d, rc6.d}. For our purpose, choose 'rc2.d'. Don't use 'rc0.d', 'rc1.d', or 'rc6.d'; they contain shutdown and reboot scripts. We can also set the priority of the process using 'SXX', where XX represents the priority {00<=XX<=99}. The letter 'S' represents 'Start' and can be replaced by 'K' ('Kill') to deactivate a script.   

COMMAND:
cd /etc/rc2.d
sudo ln -s ../init.d/auto_visparsh S99auto_visparsh

STEP4: Restart
Now restart the board. This time the belt will start automatically.

Tuesday, November 29, 2011

Test Drive-1: GET, SET……… GO!!!!!!!!!!


After one and a half months of hard work, we have finally finished the first phase of our project: viSparsh v1.0. The belt includes a plastic box containing the PandaBoard, mbed, and voltage converter, with the Kinect fixed at the front of the belt. Currently we are using three vibration motors on three sides, which we plan to increase to six in version 1.1.

We had our test drive today in our hostel, and the belt worked exceptionally well with no false alarms. Some of the other fellows helped us by shooting the video and acting as obstacles. Here is the video for this test drive-






Power Supply Issues

After powering the circuit from AC mains, it was time to switch to batteries, as the viSparsh belt would be mobile. So, we purchased a Chinese 12V 2100mAh battery pack to run the Kinect and used a 7805 voltage regulator to power the mbed and PandaBoard at 5V. The batteries drained in a couple of minutes, and the situation got worse after recharging; this battery pack proved useless for us.
We then used 8 AA-size Kodak 2100mAh rechargeable cells. Though they powered our setup for some 15 minutes, that was still not what we required, so we got 8 more to power the Kinect separately. Still, the power dissipation from the 7805 was huge, and we decided to eliminate it. As one cell is rated at 1.2V (though it measures between 1.3-1.4V), we connected 4 cells in series to draw slightly more than 5V.
Here is the final number of cells we have used in the project:

1. Mbed:                  4 AA Size 1.2V 1000mAh          Output: Slightly more than 5V
2. PandaBoard:        4 AA Size 1.2V 2100mAh          Output: Slightly more than 5V
3. Kinect                  8 AA Size 1.2V 2500mAh          Output: Slightly more than 10V

With this, we were able to eliminate the 7805 voltage regulator from the circuit and hence minimize power dissipation. We tested the belt for around half an hour, and it worked perfectly with this configuration.
We may add 4 more cells in parallel to power the PandaBoard.

Another option we may consider is to order a customized battery with our specifications.
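The cell counts above can be sanity-checked with a quick calculation. The 1.3 V per-cell figure below is the measured value mentioned above, not the 1.2 V rating.

```python
# Series packs: terminal voltage is cells x per-cell voltage.
# 1.3 V is the measured open-circuit value quoted above (rated 1.2 V).
def pack_voltage(cells, per_cell_v=1.3):
    return cells * per_cell_v
```

Four cells give about 5.2 V (slightly above the 5 V the boards need), and eight give a bit over 10 V for the Kinect, matching the numbers in the table.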

Booting Ubuntu on Pandaboard EA3

STEP1: Download the Ubuntu image
Use this link to download the pre-installed image of Ubuntu 10.10. 

STEP2: Prepare the SD card
Insert the SD card in your PC and unmount it. Then use this command to copy the pre-built image onto the card.

COMMAND:
sudo sh -c 'zcat ./ubuntu-netbook-10.10-preinstalled-netbook-armel+omap4.img.gz|dd bs=4M of=/dev/sdb; sync'

STEP3: Download and copy required u-boot and MLO files
For PandaBoard versions A2 and later, follow these instructions. In our case we had an EA3 board, so we followed them.  

1.      Download panda.tar.bz2 (the u-boot/MLO archive linked from the instructions above)
2.      Untar with "tar -jxf panda.tar.bz2"
3.      Mount the first partition of the imaged SD card
4.      Copy MLO and u-boot.bin (extracted from the tar file) to the mounted partition.

STEP4: Boot from Pandaboard
Now unmount the SD card, insert it into the PandaBoard, and power it on. It should start booting.

STEP5: Now use the same procedure to install OpenNI and the Kinect driver that we used for the BeagleBoard.



Saturday, November 26, 2011

Getting ready to release the 1st version




Huhuuu! After facing lots of trouble at every step, we were finally able to get the project working by seamlessly integrating all the components. The video above shows how the Kinect detects an object (a human in our case) and raises an alarm depending on the distance between the human and the Kinect. We have used three vibration motors to produce these alert signals, each sending an alert for an obstacle in one particular direction: left, middle, or right. These three motors help a visually impaired person identify the direction in which an obstacle is present. We hope to transfer this functionality to the belt in a day or two. This will mark the end of the first phase of our project, in which we were supposed to make a working prototype of the viSparsh belt.

Wednesday, November 23, 2011

Solving the Response Time Problem


The response time of the data transmitted by the Kinect to the BeagleBoard came to around 4-5 seconds, quite contrary to our expectations. By 'response time' we mean the time lag between two samples of data transmitted by the Kinect to the board.

To solve this problem, we tried running the same setup on the PandaBoard, and hurray! the problem was solved: the response time decreased to a few milliseconds. Though both boards have a 1GHz processor, the BeagleBoard has 512MB of DDR RAM whereas the PandaBoard has 1GB of DDR2 RAM.
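A lag like this can be quantified with a small timing harness around the frame-read call. The sketch below is an illustration, not the code we ran: read_frame is a stand-in for whatever call blocks until the next Kinect depth sample arrives.

```python
import time

def mean_response_time(read_frame, samples=30):
    """Average seconds between consecutive frames over `samples` reads."""
    read_frame()  # discard the first read so timing starts on a fresh frame
    start = time.monotonic()
    for _ in range(samples):
        read_frame()
    return (time.monotonic() - start) / samples
```

Comparing this number on the BeagleBoard and the PandaBoard makes the seconds-versus-milliseconds gap concrete.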

Active USB Hub: The way out for our Kinect Detection Problem



To our utter surprise, the Kinect stopped working all of a sudden. On running the Kinect sample programs, the device was not detected, as mentioned in one of our previous posts. A few days back, we finally found the solution. After trying all the possibilities, such as software errors, we were left with trying an active (powered) USB hub. And hurray, it actually worked! The problem was that although the Kinect runs on a 12V adapter, it still draws an ample amount of current from the USB port to which it is connected. When connected to the BeagleBoard/PandaBoard directly, the Kinect was not able to draw sufficient current from the board. By connecting the Kinect through the active USB hub instead, we resolved the detection problem.

Saturday, November 12, 2011

Time for Efficiency Optimization


The YIF journey has been a life-changer for each of the Young India Fellows. For team viSparsh especially, managing a technical project alongside an intensive liberal arts course has been tough. We have been working days and nights to quickly complete the first phase of our ELM (building the viSparsh belt prototype). However, the demanding coursework and other activities at YIF also consumed a considerable part of our time. We finally understood the urgency of effective time management for the project, which we had not strictly abided by earlier. Following the guidance of our mentor, Prof. Rahul Mangharam, we started by chalking out a plan for the coming two weeks. Given below is the work allocation for the next two weeks:
– Resolving Issues coming with Kinect on Beagleboard(Tushar- 16/11/11)
– Completing all installation work( including OS and Kinect) on PandaBoard (Jatin-17/11/11)
– Talking to organizations like Kritical Solutions and Saksham NGO (suggested by YIFP admin) to get inputs from visually impaired people, knowing about their issues and talking to other people already working in this field. (Rolly-30/11/11)
– Making PCB for the project (Jatin-22/11/11)
– Integrating all components (Tushar-26/11/11)
– Making Initial Project with Belt (Tushar-26/11/11)
– Updating Blog, contacting people and other communications (Rolly- on regular basis)
Alongside this initial work distribution, we will soon make a proper timeline until the end of the project, to help us keep better track of it and meet the deadlines.

Problem Bubbles Popping Out


It seems we are on a board of 'Snakes & Ladders'. Just when we were about to complete Phase I, another technical issue bugged us. The Kinect, which was working well so far (as posted earlier), is no longer detected by the BeagleBoard. 
We are getting the following error on running a sample Kinect program-
Error: One or more of the following nodes could not be enumerated.
Device: PrimeSense/SensorKinect/5.0.3.4: The device is not connected!
For the last four days we have been struggling with this problem. We have tried various fixes available on the web, but none has worked so far. We have even prepared a new SD card with everything installed from scratch, but the same error persists. We are trying our best to find any point we may have unknowingly missed. However, this has forced us to push back our submission deadlines, also accounting for the term papers and exams in the pipeline.

Monday, November 7, 2011

Battle with Bugs


The entire last week was a battle with bugs and installation problems. The OpenNI website updated the OpenNI files without updating the Sensor files. This created errors during installation, and we were forced to read almost every blog and website on the web to debug it. We left no stone unturned and finally managed to install the sensor correctly. Here is a detailed description of the installation instructions and debugging. We have tried to put all the problems and their solutions in one place so that it can save others time.

STEP 1: Install required packages for Kinect
We need to install certain packages in Ubuntu. These packages, then, would facilitate the installation of OpenNI and Kinect.

COMMAND:
sudo apt-get install git-core cmake libglut3-dev pkg-config 
sudo apt-get install gcc g++ build-essential libxmu-dev 
sudo apt-get install libxi-dev libusb-1.0-0-dev 
sudo apt-get install doxygen graphviz git

ERROR:
libglut3-dev cannot be installed

FIX:
sudo apt-get install git-core cmake freeglut3-dev pkg-config
sudo apt-get install gcc g++ build-essential libxmu-dev 
sudo apt-get install libxi-dev libusb-1.0-0-dev 
sudo apt-get install doxygen graphviz git
In the latest versions, 'libglut3-dev' has been replaced by 'freeglut3-dev', so install it in place of 'libglut3-dev'. In case it does not work, install libglut3 from launchpad.net/ubuntu/natty/+package/libglut3-dev.

Although some blogs say that 'doxygen' and 'graphviz' are optional, we found that they are required, so don't skip them during installation.

STEP 2: Create a new directory for Kinect

COMMAND:
mkdir ~/kinect
cd ~/kinect

STEP 3: Download OpenNI from the git repository

COMMAND:
git clone https://github.com/OpenNI/OpenNI.git


STEP 4: Install OpenNI

COMMAND:
cd OpenNI/Platform/Linux-x86/Build
make && sudo make install

ERROR1:
Cannot find the metadata file "system.windows.forms.dll"

FIX1:
sudo apt-get install mono-complete
Although it is odd that Linux throws an error about a Windows DLL file, the solution is to install 'mono'. Mono is a platform for running and developing applications based on the ECMA/ISO standards.

ERROR2:
No access permission for install.sh and RedistMaker.

FIX2:
The error indicates that the install.sh and RedistMaker files do not have execution permission. Therefore give them permission to execute.
cd ../CreateRedist
sudo chmod +x install.sh RedistMaker
cd ../Build

ERROR3:
CommonMakefile does not exist

FIX3:
sudo apt-get install mono-complete

ERROR 4:
“arm-angstrom-linux-gnueabi” does not exist.
The error is generated because we are on the ARM platform and the Platform.Arm file tries to access the gnueabi file for Angstrom (arm-angstrom-linux-gnueabi), which does not exist in Ubuntu.

FIX 4:
cd Common 
mv Platform.Arm Platform.Arm.BAK
cp Platform.x86 Platform.Arm

ERROR 5:
Unrecognized command line option "-malign-double"
Unrecognized command line option "-msse2"

FIX 5: Edit the Platform.Arm file in ~/kinect/OpenNI/Platform/Linux-x86/Build/Common

In the latest version of OpenNI, the ARM platform has been included under the name Linux-ARM. However, if you go into ~/kinect/OpenNI/Platform/Linux-ARM/Build/ and try to build, you will face errors, so it is better to use the old Linux-x86 platform files with some modifications.

Open the ~/kinect/OpenNI/Platform/Linux-x86/Build/Common/Platform.Arm and ~/kinect/OpenNI/Platform/Linux-x86/Build/Common/Platform.x86 files and comment out these lines:

  CFLAGS += -malign-double
  and
  ifeq ($(SSE_GENERATION), 2)
        CFLAGS += -msse2
  else
        ifeq ($(SSE_GENERATION), 3)
               CFLAGS += -msse3
        else
               $(error "Only SSE2 and SSE3 are supported")
        endif
  endif

We need to do this because the '-malign-double' and '-msse2'/'-msse3' flags are valid only for the x86 platform and will not work with gcc on ARM. Commenting out these statements makes the code build for ARM.


STEP 5: Download Kinect driver

Most blogs mention https://github.com/boilerbots/Sensor.git for the Kinect sensor's files. These files are not up to date and generated many errors and bugs. Use https://github.com/avin2/SensorKinect, which is the updated version and does not produce these errors.

COMMAND: 
cd ~/kinect/
git clone https://github.com/avin2/SensorKinect
cd SensorKinect


STEP 6: Install Kinect driver
cd Platform/Linux-x86/Build
make && sudo make install

ERROR1:
CommonMakefile does not exist

FIX1:
sudo apt-get install mono-complete
Now try building again. If that does not work, do this-
gedit ~/kinect/SensorKinect/Platform/Linux-x86/Build/Makefile
Replace 'LIB_USED' with 'USED_LIBS' and build again. You may find that the name is already 'USED_LIBS'; in that case, follow the instructions below.

The latest versions of OpenNI have renamed the common file to CommonCppMakefile. However, most of the Sensor files have not yet been updated. Therefore, execute these commands to create a soft link to CommonCppMakefile with the name CommonMakefile.
cd /usr/include/ni
ln -s ./CommonCppMakefile ./CommonMakefile

If that does not work, open /usr/include/ni and search for these files-
CommonCppMakefile
CommonDefs.mak
CommonTargets.mak
CommonCSMakefile
CommonJavaMakefile
Platform.CE4100
Platform.x86
Platform.Arm
OpenNI has probably not created these files. Therefore, you need to copy them here. Execute these commands-
cd ~/kinect/OpenNI/Platform/Linux-x86/Build
cp Common/* /usr/include/ni

STEP 7: Now use your Kinect
Connect the Kinect to the BeagleBoard and run the sample programs-
cd ~/kinect/OpenNI/Platform/Linux-x86/Bin/Release
./SampleNiRead


ERROR :
InitFromXml failed: Failed to set USB Interface!

FIX :
sudo rmmod gspca_kinect


    

Bootstrap


Here are the instructions to download an image of Ubuntu Maverick and install it on an SD/MMC card. You may choose any Ubuntu flavour you wish; we have chosen Maverick 10.10 here. Download any other flavour of Ubuntu from here: http://rcn-ee.net/deb/rootfs/maverick/.

STEP 1: Download and extract Ubuntu Maverick 10.10
wget http://rcn-ee.net/deb/rootfs/maverick/ubuntu-10.10-r7-minimal-armel.tar.xz
 
tar xJf ubuntu-10.10-r7-minimal-armel.tar.xz
cd ubuntu-10.10-r7-minimal-armel

STEP 2: Look for the MMC/SD card
sudo ./setup_sdcard.sh --probe-mmc

This would result in something like this-
Are you sure? I Don't see [/dev/idontknow], here is what I do see...
 
fdisk -l:
Disk /dev/sda: 500.1 GB, 500107862016 bytes <-x86 Root Drive
Disk /dev/sdb: 7957 MB, 7957325824 bytes    <-MMC/SD card
 
mount:
/dev/sda1 on / type ext4 (rw,errors=remount-ro,commit=0) <-x86 Root Partition
 Here you can see that /dev/sdb matches your MMC card specification.

STEP 3: Partition the MMC/SD card and install the Ubuntu image
sudo ./setup_sdcard.sh --mmc /dev/sdb --uboot beagle

Version                      --uboot option
BeagleBoard Bx               --uboot beagle_bx
BeagleBoard Cx, xM, A/B/C    --uboot beagle
PandaBoard                   --uboot panda

STEP 4: Insert the MMC/SD card into the SD card slot on the BeagleBoard, connect the monitor, and boot it. Use these login details:
Username: ubuntu 
Password: temppwd

STEP 5: Connect to Internet and install GUI and other necessary packages
sudo dhclient eth0
sudo apt-get update
sudo apt-get install xubuntu-desktop
sudo apt-get install xfce4 gdm xubuntu-gdm-theme xubuntu-artwork 
sudo apt-get install xserver-xorg-video-omap3 network-manager
We also ran 'sudo apt-get upgrade' at this step; it took several hours and left us with a non-graphical interface. Therefore, there is no need to do that here. 

STEP 6: Restart the BeagleBoard.