int_ua's Avatar
Posts: 676 | Thanked: 1,067 times | Joined on Jul 2010 @ Kyiv, Ukraine
#11
Can someone with admin rights move this thread to "Development"?
 
int_ua's Avatar
Posts: 676 | Thanked: 1,067 times | Joined on Jul 2010 @ Kyiv, Ukraine
#12
Two-dimensional relative rotation is almost ready. I'm adding Blender-specific actions.

And only when I was writing the Python class for converting accelerometer data into rotations did I realise that for full 3D rotation we need a compass or two highly accurate accelerometers...
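The pitch/roll half of that conversion can be sketched in a few lines (a minimal illustration, not the thread's actual class; the axis convention and milli-g units are assumptions):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Convert one 3-axis accelerometer sample (any consistent unit,
    e.g. milli-g) into pitch and roll in degrees. Yaw around the
    gravity vector is unrecoverable without a compass."""
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Device lying flat: gravity entirely on the z axis.
print(tilt_from_accel(0, 0, 1000))  # → (0.0, 0.0)
```

This is exactly why a third rotation axis is missing: any rotation about the gravity vector leaves the accelerometer reading unchanged.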
 
Posts: 840 | Thanked: 823 times | Joined on Nov 2009
#13
Nice project, keep up the good work. Just a slightly odd idea, but has anybody tried to track a lit IR port on the N900 with a webcam along with the accelerometers? It might be a good way of mimicking a PlayStation Move for absolute translation and rotation.
 

lcuk's Avatar
Posts: 1,635 | Thanked: 1,816 times | Joined on Apr 2008 @ Manchester, England
#14
Originally Posted by int_ua View Post
Both Blender and Maemo have Python.
So it is potentially possible, and potentially not so hard, to make an N900 3D Blender controller.
All we need are Python modules for:
0. Recording accelerometer data on the N900;
1. Streaming accelerometer data live to a desktop and caching it in real time;
2. Controlling the active Blender view.

I suppose it will be:
a) active only when holding some button (the camera button, for example), and
b) turning the Blender view relatively, not absolutely.
The onedotzero application uses the OpenSoundControl library to transmit data about what it is doing in the different interactions.
If you could listen from the desktop using the OSC bindings, you would not need to create a new app.
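For reference, the OSC wire format is simple enough to decode by hand if the liblo bindings are unavailable on the desktop; a minimal sketch covering only int32/float32/string arguments (the `/acc` address in the example is hypothetical):

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are NUL-terminated and padded to a 4-byte boundary.
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    offset = (end + 4) & ~3  # skip the NUL and the padding
    return s, offset

def parse_osc_message(data):
    """Decode a single OSC message: address pattern, type-tag string,
    then big-endian arguments. Enough to inspect what an OSC sender
    such as onedotzero transmits over UDP."""
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags.lstrip(","):
        if tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
            offset += 4
        elif tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
            offset += 4
        elif tag == "s":
            s, offset = _read_padded_string(data, offset)
            args.append(s)
    return address, args
```

In practice one would bind a UDP socket and feed each received datagram to `parse_osc_message`; real projects would normally use liblo (or its Python bindings) instead of parsing by hand.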
__________________
liqbase sketching the future.
like what i say? hit the Thanks, thanks!
twitter.com/lcuk
 

Posts: 840 | Thanked: 823 times | Joined on Nov 2009
#15
Originally Posted by int_ua View Post
Two-dimensional relative rotation is almost ready. I'm adding Blender-specific actions.

And only when I was writing the Python class for converting accelerometer data into rotations did I realise that for full 3D rotation we need a compass or two highly accurate accelerometers...
Just realised how small and recessed the IR LED is. It would probably be impossible to track depth with a webcam using the IR port; but what about the screen? You could light the screen with a specific colour (a flashlight app, maybe) and then track the corners. http://www.roninworks.com/?p=14
Better yet, it might be better to track three circles of different colours set at the corners of a triangle on the screen. Rotation and translation could then be tracked visually, with the accelerometers used only for stability. Your rotation would be limited since the screen has to be visible to the webcam, but it should be good enough.
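The marker-triangle idea boils down to comparing the observed triangle against a reference one; a rough sketch of the geometry (a naive similarity-transform estimate, all names hypothetical — a real tracker would first segment the coloured circles from the webcam frame):

```python
import math

def pose_from_markers(ref, obs):
    """Estimate 2-D translation, in-plane rotation (degrees) and scale
    change of a marker triangle seen by a webcam. ref/obs are lists of
    three (x, y) points in the same order; the scale change is a rough
    proxy for motion in depth."""
    cref = [sum(p[i] for p in ref) / 3.0 for i in (0, 1)]
    cobs = [sum(p[i] for p in obs) / 3.0 for i in (0, 1)]
    translation = (cobs[0] - cref[0], cobs[1] - cref[1])
    # Compare the direction and length of one triangle edge in both frames.
    vr = (ref[1][0] - ref[0][0], ref[1][1] - ref[0][1])
    vo = (obs[1][0] - obs[0][0], obs[1][1] - obs[0][1])
    rotation = math.degrees(math.atan2(vo[1], vo[0]) - math.atan2(vr[1], vr[0]))
    scale = math.hypot(*vo) / math.hypot(*vr)
    return translation, rotation, scale
```

Accelerometer data could then be blended in only to smooth out frames where one marker is occluded.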
 
int_ua's Avatar
Posts: 676 | Thanked: 1,067 times | Joined on Jul 2010 @ Kyiv, Ukraine
#16
I've invented (can't find a better English word for this, requesting your help :] ) two possible ways of controlling turning (besides switching side inclination to control rotation):
1) Use swipes on the screen to rotate the view.
2) Analyze the image from the front camera and track a light source, if there is only one.
Any other ideas?
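Option 1 is essentially a linear mapping from swipe pixels to degrees; a minimal sketch (the 180°-per-screen sensitivity and the horizontal→yaw, vertical→pitch axis mapping are assumptions):

```python
def swipe_to_rotation(dx, dy, width, height, degrees_per_screen=180.0):
    """Map a swipe delta in pixels to a relative view rotation:
    dragging across the full screen turns the view by
    degrees_per_screen. Returns (yaw, pitch) in degrees."""
    yaw = dx / float(width) * degrees_per_screen
    pitch = dy / float(height) * degrees_per_screen
    return yaw, pitch

# Half-screen horizontal swipe on an 800x480 N900 display → 90° yaw.
print(swipe_to_rotation(400, 0, 800, 480))  # → (90.0, 0.0)
```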

And about the actual scripting.
There is a bug in Python 3's socketserver which restricts transferring data (I will post the error later if someone is interested), and at the moment the server works only in Blender 2.49b. But I only know how to control the view in Blender 2.54.
Searching for a way.
Any help is still appreciated.
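One plain UDP datagram per sample sidesteps socketserver entirely; a sketch of the device side, assuming the N900's usual accelerometer sysfs node (the path and the three-integers-in-milli-g format are assumptions to verify on-device):

```python
import socket

# Commonly cited N900 accelerometer sysfs node (an assumption here).
ACCEL_PATH = "/sys/class/i2c-adapter/i2c-3/3-001d/coord"

def parse_coord(line):
    """Parse one read of the accelerometer sysfs file:
    three whitespace-separated integers (milli-g)."""
    x, y, z = (int(v) for v in line.split())
    return x, y, z

def stream_once(sock, host, port, path=ACCEL_PATH):
    # One read-and-send step; loop it with a short sleep on the device.
    with open(path) as f:
        x, y, z = parse_coord(f.read())
    sock.sendto(("%d %d %d" % (x, y, z)).encode(), (host, port))
```

The desktop side then only needs a `recvfrom` loop; there is no connection state for a socketserver bug to interfere with.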

And a third problem (not directly related, but related).
How can I find out what's wrong if a package fails to compile in the Extras autobuilder? I can't find any errors in the log. (The project is "opensubtitles".)
 

int_ua's Avatar
Posts: 676 | Thanked: 1,067 times | Joined on Jul 2010 @ Kyiv, Ukraine
#17
Originally Posted by Cue View Post
it might be better to track three circles of different colours set at the corners of a triangle on the screen. Rotation and translation could then be tracked visually, with the accelerometers used only for stability. Your rotation would be limited since the screen has to be visible to the webcam, but it should be good enough.
Interesting, but beyond my programming skills for now... Maybe someday... )
Originally Posted by lcuk View Post
onedotzero application uses the opensoundcontrol library
I can't launch onedotzero:
Code:
$ onedotzero-run.sh 
sudo: /usr/bin/liqbase-playground-cpu-performance: command not found
./onedotzero: error while loading shared libraries: liblo.so.0: cannot open shared object file: No such file or directory
sudo: /usr/bin/liqbase-playground-cpu-ondemand: command not found
libliqbase1 is installed...
 
int_ua's Avatar
Posts: 676 | Thanked: 1,067 times | Joined on Jul 2010 @ Kyiv, Ukraine
#18
So, after a huge delay, I'm back.
I've decided to try to finish this concept.
I've already improved the basic transport code; two things are left:
- unpuzzle the Blender View3D rotation schema and make it work;
- create a basic GUI.

After that I will publish it. I hope it will take no more than a week or two of my spare evenings.
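The View3D rotation schema in the 2.5x Python API is quaternion-based; composing relative turns can be sketched in pure Python (the `region_3d.view_rotation` hookup mentioned in the comment is noted as an assumption — verify it against the Blender API of your version):

```python
import math

def quat_mul(a, b):
    # Hamilton product of two (w, x, y, z) quaternions: apply b, then a.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def axis_angle(axis, degrees):
    # Unit quaternion for a rotation of `degrees` about unit vector `axis`.
    h = math.radians(degrees) / 2.0
    s = math.sin(h)
    return (math.cos(h), axis[0] * s, axis[1] * s, axis[2] * s)

# Inside Blender 2.5x the same composition would be applied to the
# view quaternion (assumed: space_view3d.region_3d.view_rotation)
# once per incoming delta from the device.
turn = axis_angle((0.0, 0.0, 1.0), 90.0)
```

Composing each incoming accelerometer/swipe delta this way gives exactly the relative (not absolute) view turning described in post #11.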
 

cutehunk04's Avatar
Posts: 472 | Thanked: 195 times | Joined on Jun 2010 @ India, Mumbai
#19
eagerly waiting for it ...
__________________
Knowledge is knowing a tomato is a fruit; Wisdom is not putting it in a fruit salad
 

int_ua's Avatar
Posts: 676 | Thanked: 1,067 times | Joined on Jul 2010 @ Kyiv, Ukraine
#20
Okay, it doesn't have a GUI yet, but it's working almost as I expected. I will upload it into Extras-devel today, I hope.
 
