# Application Architecture

**Overall Architecture**

![](https://811030052-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLe89Gzl6JNQ9inpB8sRg%2Fuploads%2F8J6Zif8ej77s1PWZj6iv%2Fsaral_v1.5.3_architecture-Architecture-View-Point-1.5.jpg?alt=media\&token=81a34733-c024-40f5-8372-a70bc0980884)

* [ ] Refer to [`CQube`](https://cqube.sunbird.org) for documentation.
* [ ] Saral embeds a [TensorFlow Lite](https://www.tensorflow.org/lite) AI/ML model within the Android application to predict handwritten digits.
* [ ] Each layout is configurable in the backend as JSON. Refer to [layout-specification](https://saral.sunbird.org/learn/specifications/layout-specification "mention") and [layout-configuration](https://saral.sunbird.org/use/layout-configuration "mention") for more details.
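
Purely as an illustration of the idea (the field names below are hypothetical — the real schema is defined by the layout-specification linked above), a backend layout JSON can be loaded and walked like this:

```python
import json

# Hypothetical layout JSON; the field names are illustrative only,
# not the actual Saral layout specification.
layout_text = """
{
  "layout": {
    "name": "class-5-math-exam",
    "cells": [
      {"cellId": "roll-1", "format": {"name": "ROLLNUMBER", "value": ""}},
      {"cellId": "q1",     "format": {"name": "OMR",        "value": ""}}
    ]
  }
}
"""

layout = json.loads(layout_text)
for cell in layout["layout"]["cells"]:
    print(cell["cellId"], "->", cell["format"]["name"])
```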

**Saral SDK and Application Architecture**

![](https://811030052-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLe89Gzl6JNQ9inpB8sRg%2Fuploads%2FCbpuNcpAlWzBh3gi08MS%2Fsaral_v1.0_architecture2-Saral%20SDKnew.jpg?alt=media\&token=d4bfa5f7-ea1b-4b6a-96f4-930f76ba236d)

* [ ] [**Saral SDK**](https://saral.sunbird.org/engage/saral-sdk-source-code-repository) is an Android and React Native Software Development Kit containing the core logic to predict handwritten digits and OMR bubbles.
* [ ] [**Saral SDK**](https://saral.sunbird.org/engage/saral-sdk-source-code-repository) accepts a layout JSON as input, enriches it with predictions, and returns the enriched JSON as the response. Refer to [layout-specification](https://saral.sunbird.org/learn/specifications/layout-specification "mention").
* [ ] The [**Saral SDK**](https://saral.sunbird.org/engage/saral-sdk-source-code-repository) component can be reused in Android and React Native app development to add handwritten-digit and OMR-bubble prediction capabilities.
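
The enrich-and-return contract described above can be sketched as follows. This is a Python sketch of the data flow only — the real SDK runs on-device in Android/React Native, and the `prediction` field name here is an assumption, not the actual response schema:

```python
def enrich_layout(layout: dict, predict) -> dict:
    """Return a copy of the layout JSON with a prediction attached to
    every cell. `predict` stands in for the SDK's on-device TFLite /
    OpenCV inference; 'prediction' is an illustrative field name."""
    enriched = {"layout": dict(layout["layout"])}
    cells = []
    for cell in layout["layout"]["cells"]:
        cell = dict(cell)                    # leave the input JSON untouched
        cell["prediction"] = predict(cell)   # e.g. a digit 0-9 or a bubble state
        cells.append(cell)
    enriched["layout"]["cells"] = cells
    return enriched


# Usage with a dummy predictor standing in for the ML model:
layout = {"layout": {"cells": [{"cellId": "q1"}, {"cellId": "q2"}]}}
result = enrich_layout(layout, predict=lambda cell: "3")
```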

**Handwritten Digits ML Model**

The Handwritten Digits machine learning model is built using Python, Keras, and TensorFlow.

The model is based on the [ResNet-164](https://arxiv.org/abs/1603.05027) architecture.

To embed the model in the Android SDK/application, it is converted from HDF5 to TFLite format.
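
The HDF5-to-TFLite step uses TensorFlow's standard converter. A minimal runnable sketch — the tiny functional model below is only a stand-in for the real ResNet-164, and the file names are illustrative:

```python
import tensorflow as tf

# Stand-in for the trained digits model: the real model is a ResNet-164,
# but a tiny functional model keeps this sketch runnable.
inputs = tf.keras.Input(shape=(28, 28, 1))
x = tf.keras.layers.Flatten()(inputs)
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.save("digits.h5")  # HDF5 file, as produced by training

# Reload from HDF5 and convert to TFLite for on-device (Android) inference.
loaded = tf.keras.models.load_model("digits.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(loaded)
tflite_bytes = converter.convert()

with open("digits.tflite", "wb") as f:
    f.write(tflite_bytes)
```

The resulting `.tflite` file is what gets bundled into the Android SDK/application assets.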

The model is trained on the MNIST handwritten-digits open dataset along with handwritten digits collected from the field.

[OpenCV](https://opencv.org/) is used to capture ROIs (Regions of Interest) and preprocess images before passing them to the ML model for prediction.
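
The ROI-capture step can be sketched without OpenCV installed; in the SDK the same operation is a slice on a `cv2` image, but here a plain nested list stands in for the grayscale scan:

```python
def crop_roi(image, top, left, height, width):
    """Crop a rectangular Region of Interest out of a grayscale image.

    `image` is a 2-D list of pixel intensities (0-255). In the SDK this
    step is done on OpenCV images; this is a pure-Python illustration.
    """
    return [row[left:left + width] for row in image[top:top + height]]


# 4x6 toy "scan": crop the 2x2 bright block at row 1, column 2.
scan = [
    [0, 0,   0,   0, 0, 0],
    [0, 0, 255, 255, 0, 0],
    [0, 0, 255, 255, 0, 0],
    [0, 0,   0,   0, 0, 0],
]
roi = crop_roi(scan, top=1, left=2, height=2, width=2)
# roi == [[255, 255], [255, 255]]
```

Each cropped ROI is then normalized and passed to the TFLite model for digit prediction.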

![](https://811030052-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLe89Gzl6JNQ9inpB8sRg%2Fuploads%2FTSSA9INOXRmzSl5ileeH%2Fimage.gif?alt=media\&token=041f73e9-dc4f-4857-9b47-66e127edbc51)

**OMR(Optical Mark Recognition) Detection**

For OMR detection, the [Saral SDK](https://saral.sunbird.org/engage/saral-sdk-source-code-repository) uses OpenCV computer vision technology to capture answer-sheet images and subdivide them into individual ROI (Region of Interest) images.

Each ROI image of the answer sheet is processed with [Otsu thresholding](https://learnopencv.com/otsu-thresholding-with-opencv/) in OpenCV, and a pixel count on the thresholded image determines whether the OMR bubble is filled or unfilled.
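
The SDK relies on OpenCV's Otsu implementation; the pure-Python sketch below shows the underlying idea — choose the threshold that maximizes between-class variance, then count dark pixels to decide filled vs. unfilled. The 20% fill cutoff and the toy 8x8 bubbles are assumptions for illustration, not Saral's actual tuning:

```python
def otsu_threshold(pixels):
    """Otsu's method: pick the 0-255 threshold that maximizes the
    between-class variance of the dark/light pixel populations."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(v * n for v, n in enumerate(hist))
    sum_bg, weight_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        weight_bg += hist[t]
        if weight_bg == 0:
            continue
        weight_fg = total - weight_bg
        if weight_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / weight_bg
        mean_fg = (sum_all - sum_bg) / weight_fg
        var_between = weight_bg * weight_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t


def is_bubble_filled(roi, fill_ratio=0.2):
    """Threshold the ROI with Otsu, then call the bubble 'filled' when the
    dark-pixel fraction exceeds fill_ratio (the 20% cutoff is assumed)."""
    pixels = [p for row in roi for p in row]
    t = otsu_threshold(pixels)
    dark = sum(1 for p in pixels if p <= t)
    return dark / len(pixels) >= fill_ratio


def make_bubble(filled):
    """Toy 8x8 grayscale ROI: light paper (230) with a sparse dark outline;
    a filled bubble also gets a dark 4x4 centre."""
    img = [[230] * 8 for _ in range(8)]
    for r, c in [(1, 3), (1, 4), (3, 1), (4, 1), (3, 6), (4, 6), (6, 3), (6, 4)]:
        img[r][c] = 20
    if filled:
        for r in range(2, 6):
            for c in range(2, 6):
                img[r][c] = 20
    return img


print(is_bubble_filled(make_bubble(filled=True)))   # True
print(is_bubble_filled(make_bubble(filled=False)))  # False
```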

![](https://811030052-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FLe89Gzl6JNQ9inpB8sRg%2Fuploads%2FLlClFc4fVwKNZUZYkyY2%2Fimage.gif?alt=media\&token=6cb3dc39-2885-4286-ab72-2235b51a4b08)
