Layout configuration

Layout configuration that the Saral app uses to detect sheets and predict values

Refer to the Layout specification.

  1. Capture the printed layout using the Saral app for tagging.

    1. Set the debug option to true so that the captured image is stored on the Android phone. Modify the Java file below under /saralsdk and change the flags shown below to true.

    SaralSDKOpenCVScannerActivity.java

       mTableCornerDetection           = new TableCornerCirclesDetection(true);
       mDetectShaded                   = new DetectShaded(true);

    2. With the above changes, run the scan on the mobile.

    3. Use the command below to grab the Android-level logs and search for 'SrlSDK::CVOps: Saving file:' and 'SrlSDK::DetectShaded: Saving file:'. This gives you the paths where the saved images are stored on your Android phone.

    adb logcat

    4. Finally, use the command below to pull the images: adb pull <image path in android phone found in above logs>, for example: adb pull /storage/emulated/0/Android/data/com.saralapp/files/Download/table_4Fg.jpg

2. Tools like MS VoTT can be used to tag ROIs on the layout image. This gives a raw VoTT JSON with the ROI coordinates.

3. Use a Jupyter notebook to transform the raw VoTT JSON into the target layout specification JSON format. Existing layout transformation notebooks can be referred to under /specs/v1.5/jupyter-notebook in the repository; a rough transformation sketch in Python is also shown after this list.

4. This final layout JSON should be configured in the backend GET /roi/{examId} API for each exam to enable the layout.
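
For illustration, below is a minimal Python sketch of what such a VoTT-to-layout transformation could look like. The VoTT-side field names (regions, tags, boundingBox) follow VoTT's per-asset JSON export; the function name vott_to_layout, the single-cell grouping and the default extraction method are hypothetical. The notebooks under /specs/v1.5/jupyter-notebook remain the reference implementation.

    # Hedged sketch: convert a raw VoTT per-asset export into the Saral layout
    # "cells"/"rois" structure shown further below. Putting every ROI into a
    # single cell is illustrative only; real layouts group ROIs per field.
    import json

    def vott_to_layout(vott_path, layout_name, extraction_method="NUMERIC_CLASSIFICATION"):
        with open(vott_path) as f:
            vott = json.load(f)

        rois = []
        for index, region in enumerate(vott["regions"]):
            box = region["boundingBox"]
            rois.append({
                "annotationTags": region["tags"][0],
                "extractionMethod": extraction_method,
                "roiId": str(index + 1),
                "index": index,
                "rect": {
                    "top": int(box["top"]),
                    "left": int(box["left"]),
                    "bottom": int(box["top"] + box["height"]),
                    "right": int(box["left"] + box["width"]),
                },
            })

        return {
            "layout": {
                "version": "1.0",
                "name": layout_name,
                "cells": [{"cellId": "1", "rois": rois}],
            }
        }

    if __name__ == "__main__":
        layout = vott_to_layout("table_4Fg-asset.json", "HINDI8S13QOMR Exam Sheet Form")
        print(json.dumps(layout, indent=4))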

Starting from the v1.5.0 release, each layout can have a threshold with minimum width, minimum height and detectionRadius configured. The app uses this configuration to detect the sheet only when the minimum width and height between the dark corner circles are met by the scanned sheet/layout. This avoids invalid scans that could result in unwanted predictions.

Below is a sample layout with the threshold minWidth, minHeight and detectionRadius (the radius of the 4 dark corner circles used for alignment) configured.

The experimentalOMRDetection flag under threshold is used to enable an experimental feature that detects OMR filled with a pencil or any color pen. This experimental feature is available from the v1.5.6 release onwards.

  "layout": {
      "version": "1.0",
      "name": "HINDI8S13QOMR Exam Sheet Form",
      "threshold": {
          "minWidth" : 500,
          "minHeight": 200,
          "detectionRadius": 12,
          "experimentalOMRDetection": true
      },            
      "cells": [
          {
              "cellId": "1",
              "rois": [
                  {
                      "annotationTags": "ROLLNUMBERID1_1",
                      "extractionMethod": "NUMERIC_CLASSIFICATION",
                      "roiId": "1",
                      "index": 0,
                      "rect": {
                          "top": 151,
                          "left": 54,
                          "bottom": 178,
                          "right": 69
                      }
                  },
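
To make the threshold gate concrete, the Python sketch below fetches a layout from the backend GET /roi/{examId} API (step 4 above) and applies the minWidth/minHeight check before a scan is accepted. BASE_URL, the response wrapper and the accept_scan helper are assumptions for illustration; in practice the Saral SDK performs this check natively while scanning, and detectionRadius additionally describes the radius of the 4 dark corner circles it looks for. The sample layout above is shown truncated after its first ROI.

    # Illustrative only: fetch a layout and apply the threshold gate.
    import requests

    BASE_URL = "https://example.org/saral-backend"   # hypothetical backend base URL

    def fetch_layout(exam_id):
        resp = requests.get(f"{BASE_URL}/roi/{exam_id}")
        resp.raise_for_status()
        return resp.json()["layout"]   # "layout" wrapper assumed, as in the sample above

    def accept_scan(layout, corner_centres):
        """corner_centres: (x, y) centres of the 4 dark alignment circles found in the frame."""
        threshold = layout.get("threshold", {})
        xs = [x for x, _ in corner_centres]
        ys = [y for _, y in corner_centres]
        width, height = max(xs) - min(xs), max(ys) - min(ys)
        # Reject the frame when the sheet appears smaller than the configured minimums.
        return width >= threshold.get("minWidth", 0) and height >= threshold.get("minHeight", 0)

    # Usage with the sample thresholds above (minWidth 500, minHeight 200):
    sample = {"threshold": {"minWidth": 500, "minHeight": 200, "detectionRadius": 12}}
    print(accept_scan(sample, [(40, 30), (980, 32), (42, 610), (978, 612)]))   # True
    print(accept_scan(sample, [(40, 30), (420, 32), (42, 610), (418, 612)]))   # False: too narrow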

Extraction Methods Supported

extractionMethod | Description
NUMERIC_CLASSIFICATION | This extraction method is used for handwritten digits 0 to 9.
CELL_OMR | This extraction method is used for OMR detection. The experimentalOMRDetection flag adds support for OMR filled with a pencil or any color ball pen.
BLOCK_ALPHANUMERIC_CLASSIFICATION | This extraction method is used for alphanumeric detection; for example, an Address field can hold block letters with digits. This feature is available from the v1.5.6 release and above.
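
As a rough illustration of how the extractionMethod field drives processing, the hypothetical Python sketch below routes each ROI to a different predictor stub. The handler names are invented for this example; the real classification runs inside the Saral SDK with its bundled ML models.

    # Hypothetical routing sketch: choose a predictor per ROI based on extractionMethod.
    def predict_digit(roi_image):
        return "0"          # stub: handwritten 0-9 classifier (NUMERIC_CLASSIFICATION)

    def predict_omr(roi_image, experimental=False):
        return "FILLED"     # stub: shaded-bubble detection; 'experimental' mirrors experimentalOMRDetection

    def predict_alphanumeric(roi_image):
        return "A1"         # stub: block letters and digits (BLOCK_ALPHANUMERIC_CLASSIFICATION, v1.5.6+)

    def extract(roi, roi_image, threshold):
        method = roi["extractionMethod"]
        if method == "NUMERIC_CLASSIFICATION":
            return predict_digit(roi_image)
        if method == "CELL_OMR":
            return predict_omr(roi_image, experimental=threshold.get("experimentalOMRDetection", False))
        if method == "BLOCK_ALPHANUMERIC_CLASSIFICATION":
            return predict_alphanumeric(roi_image)
        raise ValueError("unsupported extractionMethod: " + method)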
