Figure 1. Cylinder shape + flooded contour. The surface is generated using the Huang & Menq approach explained here.
Just a few minutes ago, we managed to output the above image using an incremental approach. Thanks to Allah.
So, what's the progress so far?
- Robotics APIs (Zezo): Managed to move the robot in an arbitrary direction.
- Motion Planning (Kisho & Moussa): Finished their first approach; now working on A* (for the shortest route).
- 3D Model Construction (Mustafa & me): Just managed to output acceptable models from the incremental algorithm (after the brute-force approach).
Let's get into the topic... the "best point" in the Huang & Menq approach we explained in the previous post has three criteria:
- It must fall in the K-Nearest Neighbors of both end points of the edge. (Just get the K-NN of one end point -> List1, then use this list to get the K-NN of the other end point. A rough code sketch of this check follows the list.)
- It must fall within the area formed by the neighboring edges. (p is the boundary vertex; any candidate for p must fall within the range defined by that semi-circle.)
- It must form the biggest angle to the boundary vertex.
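To make the first criterion concrete, here is a rough C# sketch of the check. Point3, KNearest and Candidates are just illustrative names for this post, not the project's actual classes, and the brute-force K-NN stands in for whatever search structure is really used:

using System;
using System.Collections.Generic;
using System.Linq;

struct Point3
{
    public double X, Y, Z;
    public Point3(double x, double y, double z) { X = x; Y = y; Z = z; }

    public double DistanceTo(Point3 o)
    {
        double dx = X - o.X, dy = Y - o.Y, dz = Z - o.Z;
        return Math.Sqrt(dx * dx + dy * dy + dz * dz);
    }
}

static class BestPointCandidates
{
    // Brute-force K-NN by index; a k-d tree would replace this in real code.
    public static List<int> KNearest(List<Point3> cloud, Point3 query, int k)
    {
        return Enumerable.Range(0, cloud.Count)
                         .OrderBy(i => cloud[i].DistanceTo(query))
                         .Take(k)
                         .ToList();
    }

    // Criterion 1: a candidate must appear in the K-NN lists of BOTH
    // end points (a, b) of the boundary edge.
    public static List<int> Candidates(List<Point3> cloud, Point3 a, Point3 b, int k)
    {
        List<int> list1 = KNearest(cloud, a, k);               // K-NN of the first end point
        var knnOfB = new HashSet<int>(KNearest(cloud, b, k));  // K-NN of the second end point
        return list1.Where(i => knnOfB.Contains(i)).ToList();
    }
}

The remaining candidates would then be filtered by the second criterion and ranked by the angle they form at the boundary vertex.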
We had to think of a way to project 3D points onto an estimated plane so that we can measure the angle easily.
We had two ideas (actually the first is our own idea; the second is from an online document):
- Given a vertex v and a plane P (normal: N, centroid: C), the procedure is as follows:
vec = v - C -> the vector from the plane's centroid to v
norm1 = N × vec -> a vector normal to the plane that contains N & vec (norm1 will definitely fall in P)
result = norm1 × N -> a vector normal to the plane that contains N & norm1, so it falls in P and in the plane that contains vec & N (for a unit normal N, this is exactly the projection of vec onto P)
We are not sure about this method, so just don't use it.
- result = N * (N.DotMultiplyPositionVector(vec)) * -1;
Explained here. (A rough code sketch of both ideas follows below.)
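To make the two ideas concrete, here is a rough C# sketch of both projections. Vec3 and the method names are made up for this post, not our actual classes, and we read the second snippet as computing the offset -N(N·vec) that gets added back onto vec. For a unit normal N both return the same projected point, because (N × vec) × N = vec - N(N·vec):

using System;

struct Vec3
{
    public double X, Y, Z;
    public Vec3(double x, double y, double z) { X = x; Y = y; Z = z; }

    public static Vec3 operator +(Vec3 a, Vec3 b) => new Vec3(a.X + b.X, a.Y + b.Y, a.Z + b.Z);
    public static Vec3 operator -(Vec3 a, Vec3 b) => new Vec3(a.X - b.X, a.Y - b.Y, a.Z - b.Z);
    public static Vec3 operator *(Vec3 a, double s) => new Vec3(a.X * s, a.Y * s, a.Z * s);

    public static double Dot(Vec3 a, Vec3 b) => a.X * b.X + a.Y * b.Y + a.Z * b.Z;

    public static Vec3 Cross(Vec3 a, Vec3 b) =>
        new Vec3(a.Y * b.Z - a.Z * b.Y,
                 a.Z * b.X - a.X * b.Z,
                 a.X * b.Y - a.Y * b.X);
}

static class PlaneProjection
{
    // Idea 1: two cross products (n must be a unit normal).
    public static Vec3 ProjectByCrossProducts(Vec3 v, Vec3 n, Vec3 c)
    {
        Vec3 vec = v - c;                    // vector from the plane centroid to v
        Vec3 norm1 = Vec3.Cross(n, vec);     // normal to the plane of n & vec; lies in P
        Vec3 result = Vec3.Cross(norm1, n);  // projection of vec onto P
        return c + result;                   // back to an actual point on the plane
    }

    // Idea 2: remove the component of vec along the normal.
    public static Vec3 ProjectByNormalComponent(Vec3 v, Vec3 n, Vec3 c)
    {
        Vec3 vec = v - c;
        Vec3 offset = n * Vec3.Dot(n, vec) * -1.0;  // the post's "result" line
        return c + vec + offset;
    }
}

Once the candidate points are projected into P this way, the angle at the boundary vertex (criterion three) can be measured with an ordinary dot product between the two in-plane edge vectors.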
That's all for now..
Assalamu alaikum wa rahmatullah wa barakatuh (peace and the mercy and blessings of Allah be upon you).
3 comments:
Thank you for all the nice information. I'm still confused about one thing: are you using the robot's processor to do the calculations and give the results, or is it just doing the actual travel in space?
The robot's processor is not used except for listening to the sensors, listening for the computer's commands on its Bluetooth port, and of course executing these commands...
All the calculations are done on the computer...
We are using Microsoft Robotics Studio; it's a new application by Microsoft... http://microsoft.com/robotics
Hope that was useful
Good for people to know.