Fifth week
The first idea for the app is shown in Figure 1. For it to work, the imaq.VideoDevice command would be necessary to show the 3D cloud and the cloud gradient, but since those views were only for display and their properties could not be changed, both images were removed.
Figure 1 - First stages of the app
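As an illustration, a minimal sketch of how imaq.VideoDevice could grab and display one depth frame is given below; the adaptor name 'kinect' and the device ID 2 for the depth sensor are assumptions based on the standard Image Acquisition Toolbox setup, not confirmed project settings.

    % Minimal sketch: acquire one depth frame with imaq.VideoDevice and show it.
    % Assumptions: 'kinect' adaptor and device ID 2 (depth sensor).
    depthDevice = imaq.VideoDevice('kinect', 2);    % depth stream
    depthFrame  = step(depthDevice);                % grab one frame
    surf(double(depthFrame), 'EdgeColor', 'none');  % quick 3D view of the frame
    view(3);
    release(depthDevice);                           % free the device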
Since the app would need a way to connect to the Arduino, send the mesh coordinates and do the necessary calibrations, more buttons would need to be added in the future.
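A possible way to do the Arduino connection is sketched below using the serialport interface; the port name "COM3", the baud rate and the comma-separated message format are placeholders, not the project's actual protocol.

    % Hypothetical sketch of sending one mesh coordinate to the Arduino.
    % "COM3", 9600 baud and the CSV format are assumptions.
    ard = serialport("COM3", 9600);                         % open the serial link
    meshPoint = [120.5, 87.3, 15.0];                        % example X, Y, Z values
    writeline(ard, sprintf("%.2f,%.2f,%.2f", meshPoint));   % send as CSV text
    reply = readline(ard);                                  % optional acknowledgement
    clear ard                                               % closes the port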
A way to control the height of the camera was also found. It only works when set through the depth camera's properties, since doing it on the RGB camera results in an error.
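Assuming the height control refers to the Kinect tilt motor, which the Image Acquisition Toolbox exposes as CameraElevationAngle on the depth sensor's source object only, a sketch of how it could be set is given below; this interpretation is an assumption, but it would explain why the same call on the RGB camera gives an error.

    % Hedged sketch: set the Kinect elevation (tilt) angle through the depth source.
    % CameraElevationAngle is a depth-source property of the 'kinect' adaptor;
    % the value of 10 degrees is just an example.
    depthVid = videoinput('kinect', 2);          % 2 = depth sensor
    depthSrc = getselectedsource(depthVid);
    depthSrc.CameraElevationAngle = 10;          % only valid on the depth source
    delete(depthVid);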
By using the app, it was noticed that the control of zones was giving bad results. Figure 2 was made using the surf command, so it does not show the gradient between heights; yellow represents the highest points and blue the lowest.
Figure 2 - Zones
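A sketch of how a figure like Figure 2 can be produced with surf is shown below; the variable name depthMap is an assumption for the processed height data.

    % Sketch: plot the height data as a surface; with the default parula
    % colormap, blue corresponds to the lowest values and yellow to the highest.
    surf(depthMap, 'EdgeColor', 'none');   % one face per height sample
    colormap(parula);
    view(2);                               % top-down view of the zones
    colorbar;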
The problem with the gradient was that, even though the surface shape could be seen very well in Figure 2, the result of the gradient function, shown in Figure 3, was poor.
Figure 3 - Gradient function
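For reference, a sketch of the gradient computation behind a figure like Figure 3 is given below; depthMap is again an assumed variable name, and showing the magnitude of the two components is an assumption about how the result was displayed.

    % Sketch: per-pixel slope of the height data using the gradient function.
    [gx, gy] = gradient(double(depthMap));   % differences in the X and Y directions
    gradMag  = sqrt(gx.^2 + gy.^2);          % slope magnitude at each point
    imagesc(gradMag);                        % noisy result, as in Figure 3
    colorbar;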
With these results, the first idea of separating the image based on its gradient and then treating each zone as an object could still work, but it would take a lot of time to compute and the result would possibly not be as expected.
Also, the mesh processing was now done using exterior points found with the Canny filter, and the result was better.
Figure 4 - Mesh using Canny to find the contours
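A hedged sketch of this contour step follows: it assumes the Canny filter is applied to the normalised depth image and that the edge pixels are taken as the exterior points of the mesh, which only yields 2D pixel coordinates.

    % Sketch: exterior points from the Canny edge map of the depth image.
    bw = edge(mat2gray(double(depthMap)), 'canny');   % Canny edge map
    [rows, cols] = find(bw);                          % contour (exterior) points
    plot(cols, rows, 'r.');                           % 2D only, no height information
    axis ij; axis equal;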
By testing it in the app, it was found that the contours it produced were much cleaner than those from the function used before, but the result is only in 2D.