The complete source code for this blog post can be found on my GitHub. It is based on the Angular CLI and Bootstrap 4.
The goal of this blog post, or rather of this private project, was to consume a face recognition API: send it a picture captured with the camera of your laptop or computer, let it analyze the image and display the results.
We fire a request to an endpoint with specific parameters, headers and a body, which is an object carrying a source URL as the value of a property called “data”. We can do that easily with Angular as well, like:
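A minimal sketch of such a call with Angular's HttpClient could look like the following; the endpoint, the subscription key header value and the helper name are placeholders of mine, not necessarily the exact ones from the project:

```typescript
import { HttpClient, HttpHeaders } from '@angular/common/http';
import { Observable } from 'rxjs';

// Hypothetical helper: posts an image URL to the face API endpoint.
export function detectFromUrl(http: HttpClient, imageUrl: string): Observable<object> {
  const headers = new HttpHeaders({
    'Content-Type': 'application/json',
    'Ocp-Apim-Subscription-Key': '<your-subscription-key>' // placeholder
  });

  // The image is referenced via a URL inside the "data" property, as described above.
  return http.post<object>('https://<your-face-api-endpoint>/detect', { data: imageUrl }, { headers });
}
```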
But if we take a picture from the camera with a service, we do not save it and get an image file directly; instead we get a base64 representation of that image.
So the challenge here was not to send a URL in the body to the face API, but to send the base64 representation of the image instead. We can send blobs to an API, which is not difficult with the new HttpClient Angular provides us. I tried and searched a bit and found the Stack Overflow answers which I shared in the “links” section at the end of this article. I modified them a bit and wrapped them in a service, so this method takes care of generating the correct blob:
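What such a method could look like, roughly along the lines of those Stack Overflow answers (the function name is mine):

```typescript
// Converts a base64 data URL (as produced when capturing a canvas image)
// into a Blob that can be used as a binary request body.
export function base64ToBlob(dataUrl: string): Blob {
  // Split off a potential "data:image/png;base64," prefix.
  const parts = dataUrl.split(',');
  const base64Data = parts.length > 1 ? parts[1] : parts[0];
  const mimeType = parts.length > 1
    ? parts[0].replace('data:', '').replace(';base64', '')
    : 'application/octet-stream';

  // Decode the base64 string and copy it byte by byte into a typed array.
  const byteString = atob(base64Data);
  const bytes = new Uint8Array(byteString.length);
  for (let i = 0; i < byteString.length; i++) {
    bytes[i] = byteString.charCodeAt(i);
  }

  return new Blob([bytes], { type: mimeType });
}
```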
Now that we have the headers, the parameters and the body, we can set up a simple HTTP call to the API with Angular, passing the subscription key and the base64 representation of the image, like:
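A hedged sketch of what this could look like wrapped in a service; the endpoint, the query parameters and the import path of the blob helper are assumptions on my side:

```typescript
import { HttpClient, HttpHeaders, HttpParams } from '@angular/common/http';
import { Injectable } from '@angular/core';
import { Observable } from 'rxjs';
import { base64ToBlob } from './base64-to-blob'; // helper from the snippet above, hypothetical path

@Injectable()
export class FaceRecognitionService {
  // Placeholder endpoint, to be replaced with the real face API URL.
  private readonly endpoint = 'https://<your-face-api-endpoint>/detect';

  constructor(private readonly http: HttpClient) {}

  detectFromBase64(base64Image: string, subscriptionKey: string): Observable<any> {
    const headers = new HttpHeaders({
      // We send the raw image bytes instead of JSON containing a URL.
      'Content-Type': 'application/octet-stream',
      'Ocp-Apim-Subscription-Key': subscriptionKey
    });

    // Example face attributes to request; adjust to whatever the API supports.
    const params = new HttpParams().set('returnFaceAttributes', 'age,gender,smile');

    const blob = base64ToBlob(base64Image);
    return this.http.post(this.endpoint, blob, { headers, params });
  }
}
```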
Having that set up, let’s take a look at the response we get back from the API.
That’s a lot of information we get back as JSON, but we can easily cast it into a TypeScript object to work with it. So we ask the camera service to get the photo, use a switchMap to let the face recognition service work with the data, and get back the result.
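Sketched as a component, with an interface that covers only a subset of the returned JSON; the field names mirror the typical face API response, and the service names are the ones assumed in the snippets above:

```typescript
import { Component } from '@angular/core';
import { Observable } from 'rxjs';
import { switchMap } from 'rxjs/operators';
import { CameraService } from './camera.service';                    // hypothetical camera wrapper
import { FaceRecognitionService } from './face-recognition.service'; // service from the snippet above

// Illustrative subset of the JSON the face API sends back.
export interface FaceDetectionResult {
  faceId: string;
  faceRectangle: { top: number; left: number; width: number; height: number };
  faceAttributes: { age: number; gender: string; smile: number };
}

@Component({
  selector: 'app-face-detection',
  templateUrl: './face-detection.component.html'
})
export class FaceDetectionComponent {
  results$?: Observable<FaceDetectionResult[]>;

  constructor(
    private readonly cameraService: CameraService,
    private readonly faceRecognitionService: FaceRecognitionService
  ) {}

  takePhotoAndAnalyse(): void {
    // Take the photo (a base64 string) and hand it over to the face API.
    this.results$ = this.cameraService.getPhoto().pipe(
      switchMap(base64Image =>
        this.faceRecognitionService.detectFromBase64(base64Image, '<your-subscription-key>')
      )
    );
  }
}
```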
We can then pass this response on and display it in a table format like:
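A sketch of the corresponding template (the column selection is mine), using a Bootstrap 4 table and the async pipe to unwrap the observable from the component above:

```html
<!-- face-detection.component.html: renders the detection results as a Bootstrap table -->
<table class="table table-striped" *ngIf="results$ | async as results">
  <thead>
    <tr>
      <th>Face id</th>
      <th>Age</th>
      <th>Gender</th>
      <th>Smile</th>
    </tr>
  </thead>
  <tbody>
    <tr *ngFor="let face of results">
      <td>{{ face.faceId }}</td>
      <td>{{ face.faceAttributes.age }}</td>
      <td>{{ face.faceAttributes.gender }}</td>
      <td>{{ face.faceAttributes.smile }}</td>
    </tr>
  </tbody>
</table>
```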