Hello,
Our product is built around Augmented Reality.
The final goal is real-time AR. Since that requires migrating/optimising our AR libraries so they can run on phone devices at an acceptable frame rate, we decided to start with a simplified version, which does not necessarily need to work on iOS devices. It should include the following steps:
1. Normal camera video preview
2. User action: preview freeze + passing a parameter from JS to UNO
3. Converting the data[] into a JPEG or PNG image (sketched below)
4. Posting the image + the parameter to our server for AR processing while the app waits (also sketched below)
5. Retrieving the processed image from the server
6. Displaying it
7. Following the user's next action, the processed image becomes invisible and we go back to step 1.
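For step 3, Android's own APIs may already cover this: if the byte[] comes from a Camera.PictureCallback it is normally already JPEG-encoded and can be written to a file as-is, while raw preview frames default to the NV21 format, which android.graphics.YuvImage can compress to JPEG. A minimal sketch, assuming the NV21 case (the class and method names here are just placeholders):

```java
import android.graphics.ImageFormat;
import android.graphics.Rect;
import android.graphics.YuvImage;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class FrameEncoder {
    // Converts a raw NV21 preview frame into JPEG bytes.
    // width/height must match the preview size the camera was configured with.
    public static byte[] nv21ToJpeg(byte[] data, int width, int height, int quality) {
        YuvImage yuv = new YuvImage(data, ImageFormat.NV21, width, height, null);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // Compress the full frame; a smaller Rect would crop instead.
        yuv.compressToJpeg(new Rect(0, 0, width, height), quality, out);
        return out.toByteArray();
    }

    // Writes the JPEG bytes to a file so the upload step can reference it.
    public static void writeToFile(byte[] jpeg, String path) throws IOException {
        FileOutputStream fos = new FileOutputStream(path);
        try {
            fos.write(jpeg);
        } finally {
            fos.close();
        }
    }
}
```

PNG is harder to get directly from YUV data; decoding the JPEG into a Bitmap and re-compressing it with Bitmap.compress(Bitmap.CompressFormat.PNG, ...) would work, but JPEG alone should be enough for uploading.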
So far we know how to achieve all of these steps with native Android code inside UNO (very likely not in the cleanest possible way) except step 6. We need your help (intensive) on that one and, if possible, advice on implementing the others "properly" (our coding expertise/time is quite limited so far).
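For steps 4 and 5, a plain HttpURLConnection round trip may be all you need while the app blocks waiting anyway. A rough sketch, assuming the server accepts a raw JPEG body with the parameter in the query string (the URL and parameter name are made up; adjust them to your server's actual API):

```java
import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class ArClient {
    // Uploads JPEG bytes plus a parameter, and returns the processed
    // image bytes from the response body.
    public static byte[] process(byte[] jpeg, String param) throws Exception {
        URL url = new URL("https://example.com/ar/process?param="
                + URLEncoder.encode(param, "UTF-8"));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "image/jpeg");

        // Send the image as the request body.
        OutputStream out = conn.getOutputStream();
        out.write(jpeg);
        out.close();

        // Read the processed image back.
        InputStream in = conn.getInputStream();
        ByteArrayOutputStream result = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            result.write(buf, 0, n);
        }
        in.close();
        conn.disconnect();
        return result.toByteArray();
    }
}
```

Android forbids network I/O on the main thread, so call this from a background thread and hand the result back to the UI when it arrives. If your server expects multipart/form-data instead, the same connection setup applies and only the body framing changes.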
We are using the CameraPanel project for steps 1 and 2 (we stop the preview and get the image data by using its method "TakePicture").
As indicated at the beginning, the final goal is real-time preview, meaning that every new frame from the video preview should be processed by our AR library and then rendered: not rendering the normal video preview, but only the processed frames. As before, we have no clue how to achieve this.
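For what it's worth, one common pattern with the old android.hardware.Camera API (the one CameraPanel-era code uses) is to register a Camera.PreviewCallback, route the raw preview into an off-screen SurfaceTexture so it never reaches the screen, and render only the processed frames yourself. A sketch, with the AR and rendering hooks left as hypothetical stubs:

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.SurfaceTexture;
import android.hardware.Camera;
import java.io.IOException;

public class ProcessedPreview implements Camera.PreviewCallback {
    // Starts the preview with this class receiving every raw frame.
    public void start(Camera camera) throws IOException {
        // Dummy off-screen sink: the raw preview is consumed here and
        // never drawn on screen (API 11+).
        camera.setPreviewTexture(new SurfaceTexture(10));
        camera.setPreviewCallback(this);
        camera.startPreview();
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera cam) {
        Camera.Size size = cam.getParameters().getPreviewSize();
        // Reuses the NV21->JPEG helper from the earlier sketch, then decodes
        // the frame so it can be handed to the AR library and rendered.
        byte[] jpeg = FrameEncoder.nv21ToJpeg(data, size.width, size.height, 80);
        Bitmap frame = BitmapFactory.decodeByteArray(jpeg, 0, jpeg.length);
        render(processWithArLibrary(frame));
    }

    // Hypothetical stubs: plug in your AR library and your drawing code here.
    private Bitmap processWithArLibrary(Bitmap frame) { return frame; }
    private void render(Bitmap frame) { }
}
```

Be aware that a JPEG round trip per frame, as in this sketch, is far too slow for real time: for acceptable frame rates the AR library would need to consume the NV21 buffer directly, or the processing would have to move to the GPU. Also, onPreviewFrame runs on the thread the camera was opened on, so heavy work there will stall the preview.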
Thanks in advance for the support,
Marc