Friday 11 April 2008

11/04/08 - Great Success

Yeah, it's been a while since my last post... but I said I'd keep updating, so here we are.

Since my last post, I've been working on ASEDIT and meeting with Graham in the Ubisense area in DCU so that we could test everything. The plan was to take a load of measurements and metrics, but... there were problems, as always!
The problem that I spent the most time on was the "region editor" mode of ASEDIT. This mode allows the user to draw a shape which represents the Ubisense-equipped region of a room - the region where the application is to run - and to assign Ubisense coordinates to each corner. The idea was that the coordinates from Ubisense could be converted to screen coordinates, so that I could draw the user's location accurately within the shape representing the room. Also, when a sound source is placed in the room, the position of the sound source needs to be converted from screen coordinates to Ubisense coordinates.
Well... I thought this would be straightforward enough, but after AGES of struggling I gave up and decided on a simpler version - instead of drawing a potentially odd shape, you can now only draw a rectangle, and it is now guaranteed that the corners are in a certain order (e.g. corner 1 is always the top left-hand corner). Eventually I got it working; mostly I think I was just making silly little mistakes, but this morning I got it working pretty much perfectly!
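Just to illustrate the corner-ordering idea, here's a rough sketch - this is not the actual ASEDIT code and the function name is made up - of how a dragged rectangle can be forced into a fixed corner order by sorting the two drag end-points:

def normalise_rectangle(x1, y1, x2, y2):
    # Rough sketch (not the real ASEDIT code): take the two end-points of a
    # rectangle drag, in screen space where y grows downwards, and return the
    # four corners in a guaranteed order:
    # 0 = top-left, 1 = top-right, 2 = bottom-right, 3 = bottom-left.
    left, right = min(x1, x2), max(x1, x2)
    top, bottom = min(y1, y2), max(y1, y2)
    return [(left, top), (right, top), (right, bottom), (left, bottom)]

# Dragging from the bottom-right up to the top-left still gives corner 0
# as the top-left:
# normalise_rectangle(600, 450, 100, 50)
#   -> [(100, 50), (600, 50), (600, 450), (100, 450)]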

Now you can use the editor to place sounds in the room and watch the user move around. It all works.
Graham and I tested it, and we got one of the other guys to test it too as he was passing by. All three of us were able to successfully locate the position of the sound and walk up to it reasonably easily. Success! What does this mean? That our project works! Well, we knew that already, but now there's no doubt ;-)

Here is the coordinate-system conversion code I wrote (to any Python programmers out there: I know it's horrible Python code - not all my code is like this!):
def _convert(self, x, y, to_screen):
    # Corners for both coordinate systems, in the order guaranteed by the
    # region editor: 0 = top-left, 1 = top-right, 3 = bottom-left.
    ubi = self.get_ubisense_coords_cb()
    scr = self._region_corners

    u_left = ubi[0][0]
    u_right = ubi[1][0]
    u_top = ubi[0][1]
    u_bottom = ubi[3][1]

    s_left = scr[0][0]
    s_right = scr[1][0]
    s_top = scr[0][1]
    s_bottom = scr[3][1]

    # Offset of the point from the top-left corner of the source system,
    # and the top-left corner of the destination system to add back on
    # at the end.
    left = x - u_left
    top = y - u_top
    sel_a, sel_b = max, min
    if to_screen:
        offset_x, offset_y = s_left, s_top
    else:
        offset_x, offset_y = u_left, u_top
        sel_a, sel_b = sel_b, sel_a
        left = x - s_left
        top = y - s_top

    # The Ubisense axes can run in the opposite direction to the screen's,
    # so flip them if they do.
    if u_top > u_bottom:
        if to_screen:
            top = u_top - y
        else:
            offset_y *= -1
        u_top, u_bottom = u_bottom, u_top
    if u_left > u_right:
        if to_screen:
            left = u_left - x
        else:
            offset_x *= -1
        u_left, u_right = u_right, u_left

    u_width = u_right - u_left
    u_height = u_bottom - u_top

    s_width = s_right - s_left
    s_height = s_bottom - s_top

    # Scale factor between the two systems (this assumes the screen region
    # is the larger of the two, i.e. more pixels than metres).
    x_ratio = sel_a([u_width, s_width]) / sel_b([u_width, s_width])
    y_ratio = sel_a([u_height, s_height]) / sel_b([u_height, s_height])

    nx = left * x_ratio
    ny = top * y_ratio

    return (abs(nx + offset_x), abs(ny + offset_y))
If to_screen is True, then x and y are assumed to represent a point in Ubisense space and the function returns the corresponding point in screen space; otherwise a point in screen space is assumed and it is returned in Ubisense space.
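And just to show roughly how it gets used, here's a minimal standalone sketch - the class and corner values below are made up for illustration, they're not the real ASEDIT region data:

# Hypothetical stand-in for the real region editor object, just so the
# method above can be exercised on its own. The corner values are made up.
class FakeRegion:
    def __init__(self):
        # Screen corners in pixels: top-left, top-right, bottom-right,
        # bottom-left.
        self._region_corners = [(100, 50), (600, 50), (600, 450), (100, 450)]

    def get_ubisense_coords_cb(self):
        # Ubisense corners in metres, in the same corner order.
        return [(0.0, 0.0), (5.0, 0.0), (5.0, 4.0), (0.0, 4.0)]

# Attach the method above (assuming it was pasted in at module level).
FakeRegion._convert = _convert

region = FakeRegion()
print(region._convert(2.5, 2.0, True))   # centre of the room -> (350.0, 250.0) on screen
print(region._convert(350, 250, False))  # and back again -> (2.5, 2.0) in metres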

Other things have happened too - I'll write an entry about them later.
