Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Save data in database online #16

Open
OneaimDev opened this issue Oct 9, 2020 · 13 comments
Open

Save data in database online #16

OneaimDev opened this issue Oct 9, 2020 · 13 comments

Comments

@OneaimDev
Copy link

how to save and retrieve data from database online like firebase ?

@venkataramanaduddu
Copy link

venkataramanaduddu commented Oct 27, 2020

Did you done this thing?

@venkataramanaduddu
Copy link

venkataramanaduddu commented Oct 27, 2020

Can you share that code for me. I'm also trying same way but for me it's not possible to do that.
Please share the code for this mail id [email protected]

@deepsingh132
Copy link

Please share the updated code with firebase on [email protected]

@sichrif
Copy link

sichrif commented Feb 11, 2021

please share code with firebase on: [email protected]

@panwarunionitc
Copy link

please sharecode with me [email protected]

@kushalkundu
Copy link

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class

import android.annotation.SuppressLint;
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.RectF;
import android.net.Uri;
import android.os.Trace;
import android.util.Log;
import android.util.Pair;
import android.widget.Toast;

import androidx.annotation.NonNull;

import com.example.rapidsoftfacerecogniser.MainActivity;
import com.example.rapidsoftfacerecogniser.env.Logger;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.firebase.storage.FileDownloadTask;
import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;
import com.google.firebase.storage.UploadTask;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import org.tensorflow.lite.Interpreter;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.lang.reflect.Type;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Vector;

/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();

    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;

    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;

    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;

    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;

    private float[][] embeedings;

    private ByteBuffer imgData;

    private Interpreter tfLite;

    // Face Mask Detector Output
    private float[][] output;

    private HashMap<String, Recognition> registered = new HashMap<>();

    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }

    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }

    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {

      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();

      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }

      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();

      d.inputSize = inputSize;

      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }

      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];

      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector
Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "[";
// for (int i = 0; i < embeedings[0].length; i++) {
// res += embeedings[0][i];
// if (i < embeedings[0].length - 1) res += ", ";
// }
// res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

@LeiiY
Copy link

LeiiY commented Aug 31, 2021

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class

import android.annotation.SuppressLint;
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.RectF;
import android.net.Uri;
import android.os.Trace;
import android.util.Log;
import android.util.Pair;
import android.widget.Toast;

import androidx.annotation.NonNull;

import com.example.rapidsoftfacerecogniser.MainActivity;
import com.example.rapidsoftfacerecogniser.env.Logger;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.firebase.storage.FileDownloadTask;
import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;
import com.google.firebase.storage.UploadTask;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

import org.tensorflow.lite.Interpreter;

import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.lang.reflect.Type;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Vector;

/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();
    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;
    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;
    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;
    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;
    private float[][] embeedings;
    private ByteBuffer imgData;
    private Interpreter tfLite;
    // Face Mask Detector Output
    private float[][] output;
    private HashMap<String, Recognition> registered = new HashMap<>();
    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }
    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }
    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {
      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();
      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }
      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();
      d.inputSize = inputSize;
      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }
      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];
      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector
Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "[";
// for (int i = 0; i < embeedings[0].length; i++) {
// res += embeedings[0][i];
// if (i < embeedings[0].length - 1) res += ", ";
// }
// res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

hello,could u zip ur project for me?
my emails is [email protected]
many thx

@kushalkundu
Copy link

kushalkundu commented Sep 2, 2021

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class
import android.annotation.SuppressLint;
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.RectF;
import android.net.Uri;
import android.os.Trace;
import android.util.Log;
import android.util.Pair;
import android.widget.Toast;
import androidx.annotation.NonNull;
import com.example.rapidsoftfacerecogniser.MainActivity;
import com.example.rapidsoftfacerecogniser.env.Logger;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.firebase.storage.FileDownloadTask;
import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;
import com.google.firebase.storage.UploadTask;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import org.tensorflow.lite.Interpreter;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.lang.reflect.Type;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Vector;
/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();
    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;
    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;
    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;
    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;
    private float[][] embeedings;
    private ByteBuffer imgData;
    private Interpreter tfLite;
    // Face Mask Detector Output
    private float[][] output;
    private HashMap<String, Recognition> registered = new HashMap<>();
    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }
    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }
    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {
      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();
      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }
      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();
      d.inputSize = inputSize;
      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }
      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];
      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector
Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "[";
// for (int i = 0; i < embeedings[0].length; i++) {
// res += embeedings[0][i];
// if (i < embeedings[0].length - 1) res += ", ";
// }
// res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

hello,could u zip ur project for me?
my emails is [email protected]
many thx

you just need to take the main code zip and replace my code in TflLiteObjectDetectionApiModel Class, also add permissions of read and write storage in detector Activity's request Permission function and in manifest along with storage, internet too, but if you still face problems let me know I'll help you.

@waxdss
Copy link

waxdss commented Jan 13, 2022

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class
import android.annotation.SuppressLint;
import android.content.res.AssetFileDescriptor;
import android.content.res.AssetManager;
import android.graphics.Bitmap;
import android.graphics.RectF;
import android.net.Uri;
import android.os.Trace;
import android.util.Log;
import android.util.Pair;
import android.widget.Toast;
import androidx.annotation.NonNull;
import com.example.rapidsoftfacerecogniser.MainActivity;
import com.example.rapidsoftfacerecogniser.env.Logger;
import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.firebase.storage.FileDownloadTask;
import com.google.firebase.storage.FirebaseStorage;
import com.google.firebase.storage.StorageReference;
import com.google.firebase.storage.UploadTask;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import org.tensorflow.lite.Interpreter;
import java.io.BufferedReader;
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.lang.reflect.Type;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Vector;
/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();
    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;
    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;
    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;
    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;
    private float[][] embeedings;
    private ByteBuffer imgData;
    private Interpreter tfLite;
    // Face Mask Detector Output
    private float[][] output;
    private HashMap<String, Recognition> registered = new HashMap<>();
    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }
    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }
    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {
      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();
      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }
      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();
      d.inputSize = inputSize;
      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }
      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];
      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector
Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "[";
// for (int i = 0; i < embeedings[0].length; i++) {
// res += embeedings[0][i];
// if (i < embeedings[0].length - 1) res += ", ";
// }
// res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

hello,could u zip ur project for me?
my emails is [email protected]
many thx

you just need to take the main code zip and replace my code in TflLiteObjectDetectionApiModel Class, also add permissions of read and write storage in detector Activity's request Permission function and in manifest along with storage, internet too, but if you still face problems let me know I'll help you.

I faced problems as Cannot resolve symbol 'example'
which library i should add? Sorry if it's stupid question. Thank you.

@waxdss
Copy link

waxdss commented Jan 22, 2022

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class

import android.annotation.SuppressLint; import android.content.res.AssetFileDescriptor; import android.content.res.AssetManager; import android.graphics.Bitmap; import android.graphics.RectF; import android.net.Uri; import android.os.Trace; import android.util.Log; import android.util.Pair; import android.widget.Toast;

import androidx.annotation.NonNull;

import com.example.rapidsoftfacerecogniser.MainActivity; import com.example.rapidsoftfacerecogniser.env.Logger; import com.google.android.gms.tasks.OnFailureListener; import com.google.android.gms.tasks.OnSuccessListener; import com.google.firebase.storage.FileDownloadTask; import com.google.firebase.storage.FirebaseStorage; import com.google.firebase.storage.StorageReference; import com.google.firebase.storage.UploadTask; import com.google.gson.Gson; import com.google.gson.reflect.TypeToken;

import org.tensorflow.lite.Interpreter;

import java.io.BufferedReader; import java.io.File; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.io.InputStreamReader; import java.io.ObjectInputStream; import java.io.ObjectOutputStream; import java.lang.reflect.Type; import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.MappedByteBuffer; import java.nio.channels.FileChannel; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.Vector;

/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();
    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;
    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;
    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;
    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;
    private float[][] embeedings;
    private ByteBuffer imgData;
    private Interpreter tfLite;
    // Face Mask Detector Output
    private float[][] output;
    private HashMap<String, Recognition> registered = new HashMap<>();
    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }
    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }
    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {
      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();
      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }
      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();
      d.inputSize = inputSize;
      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }
      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];
      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "["; // for (int i = 0; i < embeedings[0].length; i++) { // res += embeedings[0][i]; // if (i < embeedings[0].length - 1) res += ", "; // } // res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

TFLiteObjectDetectionAPIModel.create(
getAssets(),
TF_OD_API_MODEL_FILE,
TF_OD_API_LABELS_FILE,
TF_OD_API_INPUT_SIZE,
TF_OD_API_IS_QUANTIZED);

'create(android.content.res.AssetManager, java.lang.String, java.lang.String, int, boolean, org.tensorflow.lite.examples.detection.CameraActivity)' in 'org.tensorflow.lite.examples.detection.tflite.TFLiteObjectDetectionAPIModel' cannot be applied to '(android.content.res.AssetManager, java.lang.String, java.lang.String, int, boolean)'

TFLiteObjectDetectionAPIModel.create(
^
required: AssetManager,String,String,int,boolean,CameraActivity
found: AssetManager,String,String,int,boolean
reason: actual and formal argument lists differ in length

I have this problem, and i didn't understand it. Sorry for bothering and also if it's stupid question. Thanks for advance.

@waxdss
Copy link

waxdss commented Jan 22, 2022

CameraActivity det

what does it mean

@Gatisha05
Copy link

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class

import android.annotation.SuppressLint; import android.content.res.AssetFileDescriptor; import android.content.res.AssetManager; import android.graphics.Bitmap; import android.graphics.RectF; import android.net.Uri; import android.os.Trace; import android.util.Log; import android.util.Pair; import android.widget.Toast;

import androidx.annotation.NonNull;

import com.example.rapidsoftfacerecogniser.MainActivity; import com.example.rapidsoftfacerecogniser.env.Logger; import com.google.android.gms.tasks.OnFailureListener; import com.google.android.gms.tasks.OnSuccessListener; import com.google.firebase.storage.FileDownloadTask; import com.google.firebase.storage.FirebaseStorage; import com.google.firebase.storage.StorageReference; import com.google.firebase.storage.UploadTask; import com.google.gson.Gson; import com.google.gson.reflect.TypeToken;

import org.tensorflow.lite.Interpreter;

import java.io.BufferedReader; import java.io.File; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.io.InputStreamReader; import java.io.ObjectInputStream; import java.io.ObjectOutputStream; import java.lang.reflect.Type; import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.MappedByteBuffer; import java.nio.channels.FileChannel; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.Vector;

/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();
    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;
    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;
    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;
    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;
    private float[][] embeedings;
    private ByteBuffer imgData;
    private Interpreter tfLite;
    // Face Mask Detector Output
    private float[][] output;
    private HashMap<String, Recognition> registered = new HashMap<>();
    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }
    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }
    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {
      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();
      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }
      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();
      d.inputSize = inputSize;
      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }
      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];
      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "["; // for (int i = 0; i < embeedings[0].length; i++) { // res += embeedings[0][i]; // if (i < embeedings[0].length - 1) res += ", "; // } // res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

@kushalkundu
Can you please share the code
mailId :- [email protected]

@Hyson-Wayne
Copy link

For Firebase just replace this code in TFLiteObjectDetectionAPIModel class

import android.annotation.SuppressLint; import android.content.res.AssetFileDescriptor; import android.content.res.AssetManager; import android.graphics.Bitmap; import android.graphics.RectF; import android.net.Uri; import android.os.Trace; import android.util.Log; import android.util.Pair; import android.widget.Toast;

import androidx.annotation.NonNull;

import com.example.rapidsoftfacerecogniser.MainActivity; import com.example.rapidsoftfacerecogniser.env.Logger; import com.google.android.gms.tasks.OnFailureListener; import com.google.android.gms.tasks.OnSuccessListener; import com.google.firebase.storage.FileDownloadTask; import com.google.firebase.storage.FirebaseStorage; import com.google.firebase.storage.StorageReference; import com.google.firebase.storage.UploadTask; import com.google.gson.Gson; import com.google.gson.reflect.TypeToken;

import org.tensorflow.lite.Interpreter;

import java.io.BufferedReader; import java.io.File; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.io.InputStreamReader; import java.io.ObjectInputStream; import java.io.ObjectOutputStream; import java.lang.reflect.Type; import java.nio.ByteBuffer; import java.nio.ByteOrder; import java.nio.MappedByteBuffer; import java.nio.channels.FileChannel; import java.util.ArrayList; import java.util.HashMap; import java.util.List; import java.util.Map; import java.util.Vector;

/**

  • Wrapper for frozen detection models trained using the Tensorflow Object Detection API:

  • where you can find the training code.

  • To use pretrained models in the API or convert to TF Lite models, please see docs for details:

  • private static final Logger LOGGER = new Logger();
    //private static final int OUTPUT_SIZE = 512;
    private static final int OUTPUT_SIZE = 192;
    // Only return this many results.
    private static final int NUM_DETECTIONS = 1;
    // Float model
    private static final float IMAGE_MEAN = 128.0f;
    private static final float IMAGE_STD = 128.0f;
    // Number of threads in the java app
    private static final int NUM_THREADS = 4;
    private boolean isModelQuantized;
    // Config values.
    private int inputSize;
    // Pre-allocated buffers.
    private Vector labels = new Vector();
    private int[] intValues;
    // outputLocations: array of shape [Batchsize, NUM_DETECTIONS,4]
    // contains the location of detected boxes
    private float[][][] outputLocations;
    // outputClasses: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the classes of detected boxes
    private float[][] outputClasses;
    // outputScores: array of shape [Batchsize, NUM_DETECTIONS]
    // contains the scores of detected boxes
    private float[][] outputScores;
    // numDetections: array of shape [Batchsize]
    // contains the number of detected boxes
    private float[] numDetections;
    private float[][] embeedings;
    private ByteBuffer imgData;
    private Interpreter tfLite;
    // Face Mask Detector Output
    private float[][] output;
    private HashMap<String, Recognition> registered = new HashMap<>();
    @SuppressLint("LongLogTag")
    public void register(String name, Recognition rec, MainActivity det) {
    registered.put(name, rec);

     byte[] bytes = null;
     try {
    
         //  file.createNewFile();
         //write the bytes in file
         {
             Gson gson = new Gson();
    
    
             File localFile = new File(det.getFilesDir(), FileName);
             FileOutputStream fileOutputStream = new FileOutputStream(localFile);
    
             Type type = new TypeToken<HashMap<String, Recognition>>() {
             }.getType();
             String toStoreObject = gson.toJson(registered, type);
    
             ObjectOutputStream o = new ObjectOutputStream(fileOutputStream);
             o.writeObject(toStoreObject);
             //o.writeObject(registered);
    
             o.close();
             /* 26 */
             fileOutputStream.close();
    
             Toast.makeText(det.getApplicationContext(), "save file completed.", Toast.LENGTH_LONG).show();
    
             Log.d(TAG, " file created: ");
             ///     file.delete();
             Log.d(TAG, "File deleted ");
         }
    
         FirebaseStorage storage = FirebaseStorage.getInstance();
         StorageReference storageRef = storage.getReference();
         StorageReference test2 = storageRef.child(FileName);
         //test2.delete();
         //test2.putStream();
    
         Uri file = Uri.fromFile(new File(det.getFilesDir(), FileName));
    
    
         test2.putFile(file)
                 .addOnSuccessListener(new OnSuccessListener<UploadTask.TaskSnapshot>() {
                     @Override
                     public void onSuccess(UploadTask.TaskSnapshot taskSnapshot) {
                         // Get a URL to the uploaded content
                         //Uri downloadUrl = taskSnapshot.get();
                         Toast.makeText(det.getApplicationContext(), "Upload Completed.", Toast.LENGTH_LONG).show();
    
                     }
                 })
                 .addOnFailureListener(new OnFailureListener() {
                     @Override
                     public void onFailure(@NonNull Exception exception) {
                         // Handle unsuccessful uploads
                         // ...
                         Toast.makeText(det.getApplicationContext(), "Upload Failure.", Toast.LENGTH_LONG).show();
                     }
                 });
    
         Log.d(TAG, "Clique Aqui Enviou ");
    
    
     } catch (Exception e) {
    
    
         Log.d(TAG, " file created: " + e.toString());
    
         //Log.d("Clique AQUI","Clique AQUI file created: " + bytes.length);
         Toast.makeText(det.getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
    
     }
    

    }
    @SuppressLint("LongLogTag")
    private TFLiteObjectDetectionAPIModel() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "TFLiteObjectDetectionAPIModel Called");
    }
    /**

    • Memory-map the model file in Assets.
      */
      @SuppressLint("LongLogTag")
      private static MappedByteBuffer loadModelFile(AssetManager assets, String modelFilename)
      throws IOException {
      Log.d("Class TFLiteObjectDetectionAPIModel :", "loadModelFile Called");
      AssetFileDescriptor fileDescriptor = assets.openFd(modelFilename);
      FileInputStream inputStream = new FileInputStream(fileDescriptor.getFileDescriptor());
      FileChannel fileChannel = inputStream.getChannel();
      long startOffset = fileDescriptor.getStartOffset();
      long declaredLength = fileDescriptor.getDeclaredLength();
      return fileChannel.map(FileChannel.MapMode.READ_ONLY, startOffset, declaredLength);
      }

    /**

    • Initializes a native TensorFlow session for classifying images.

    • @param assetManager The asset manager to be used to load assets.

    • @param modelFilename The filepath of the model GraphDef protocol buffer.

    • @param labelFilename The filepath of label file for classes.

    • @param inputSize The size of image input

    • @param isQuantized Boolean representing model is quantized or not
      */
      @SuppressLint("LongLogTag")
      public static SimilarityClassifier create(
      final AssetManager assetManager,
      final String modelFilename,
      final String labelFilename,
      final int inputSize,
      final boolean isQuantized, MainActivity det)
      throws IOException {
      final TFLiteObjectDetectionAPIModel d = new TFLiteObjectDetectionAPIModel();
      try {
      //Toast.makeText(det.getApplicationContext(), "name is null", Toast.LENGTH_LONG ).show();

       FirebaseStorage storage = FirebaseStorage.getInstance();
       StorageReference storageRef = storage.getReference();
       StorageReference test2 = storageRef.child(FileName);
      
       File localFile = File.createTempFile("Student", ".txt");
       //File localFile = new File(det.getFilesDir(),"test2.txt");
       test2.getFile(localFile).addOnSuccessListener(new OnSuccessListener<FileDownloadTask.TaskSnapshot>() {
           @Override
           public void onSuccess(FileDownloadTask.TaskSnapshot taskSnapshot) {
      
               try {
      
                   Gson gson = new Gson();
                   ObjectInputStream i = new ObjectInputStream(new FileInputStream(localFile));
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   Type type = new TypeToken<HashMap<String, Recognition>>() {}.getType();
                   HashMap<String, Recognition> registeredl = gson.fromJson((String) i.readObject(), type);
                   //HashMap<String, Recognition> registeredl = (HashMap<String, Recognition>) i.readObject();
      
                   if (registeredl != null) {
                       d.registered = registeredl;
                   }
                   i.close();
      
                   Toast.makeText(det.getApplicationContext(), ".", Toast.LENGTH_LONG).show();
                   Log.d("Clique AQUI", "Clique Aqui Adicionado " + registeredl.size());
      
               } catch (Exception e) {
                   Log.d("Clique AQUI", "Clique Aqui erro " + e.toString());
                   Toast.makeText(det.getApplicationContext(), "Exception 1" + e.getMessage(), Toast.LENGTH_LONG).show();
               }
           }
       }).addOnFailureListener(new OnFailureListener() {
           @Override
           public void onFailure(@NonNull Exception exception) {
               Log.d("Clique AQUI", "Clique Aqui erro " + exception.toString());
               Toast.makeText(det.getApplicationContext(), "Exception 2 " + exception.getMessage(), Toast.LENGTH_LONG).show();
           }
       });
      

      } catch (Exception e) {

       Log.d("Clique AQUI", "Clique AQUI file created: " + e.toString());
      

      }
      String actualFilename = labelFilename.split("file:///android_asset/")[1];
      InputStream labelsInput = assetManager.open(actualFilename);
      BufferedReader br = new BufferedReader(new InputStreamReader(labelsInput));
      String line;
      while ((line = br.readLine()) != null) {
      LOGGER.w(line);
      d.labels.add(line);
      }
      br.close();
      d.inputSize = inputSize;
      try {
      d.tfLite = new Interpreter(loadModelFile(assetManager, modelFilename));
      } catch (Exception e) {
      throw new RuntimeException(e);
      }
      d.isModelQuantized = isQuantized;
      // Pre-allocate buffers.
      int numBytesPerChannel;
      if (isQuantized) {
      numBytesPerChannel = 1; // Quantized
      } else {
      numBytesPerChannel = 4; // Floating point
      }
      d.imgData = ByteBuffer.allocateDirect(d.inputSize * d.inputSize * 3 * numBytesPerChannel);
      d.imgData.order(ByteOrder.nativeOrder());
      d.intValues = new int[d.inputSize * d.inputSize];
      d.tfLite.setNumThreads(NUM_THREADS);
      d.outputLocations = new float[1][NUM_DETECTIONS][4];
      d.outputClasses = new float[1][NUM_DETECTIONS];
      d.outputScores = new float[1][NUM_DETECTIONS];
      d.numDetections = new float[1];
      return d;
      }

    @SuppressLint("LongLogTag")
    @OverRide
    public List recognizeImage(final Bitmap bitmap, boolean storeExtra) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "recognizeImage Called");
    // Log this method so that it can be analyzed with systrace.
    Trace.beginSection("recognizeImage");

     Trace.beginSection("preprocessBitmap");
     // Preprocess the image data from 0-255 int to normalized float based
     // on the provided parameters.
     bitmap.getPixels(intValues, 0, bitmap.getWidth(), 0, 0, bitmap.getWidth(), bitmap.getHeight());
    
     imgData.rewind();
     for (int i = 0; i < inputSize; ++i) {
         for (int j = 0; j < inputSize; ++j) {
             int pixelValue = intValues[i * inputSize + j];
             if (isModelQuantized) {
                 // Quantized model
                 imgData.put((byte) ((pixelValue >> 16) & 0xFF));
                 imgData.put((byte) ((pixelValue >> 8) & 0xFF));
                 imgData.put((byte) (pixelValue & 0xFF));
             } else { // Float model
                 imgData.putFloat((((pixelValue >> 16) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat((((pixelValue >> 8) & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
                 imgData.putFloat(((pixelValue & 0xFF) - IMAGE_MEAN) / IMAGE_STD);
             }
         }
     }
     Trace.endSection(); // preprocessBitmap
    
     // Copy the input data into TensorFlow.
     Trace.beginSection("feed");
    
    
     Object[] inputArray = {imgData};
    
     Trace.endSection();
    

// Here outputMap is changed to fit the Face Mask detector Map<Integer, Object> outputMap = new HashMap<>();

    embeedings = new float[1][OUTPUT_SIZE];
    outputMap.put(0, embeedings);


    // Run the inference call.
    Trace.beginSection("run");
    //tfLite.runForMultipleInputsOutputs(inputArray, outputMapBack);
    tfLite.runForMultipleInputsOutputs(inputArray, outputMap);
    Trace.endSection();

// String res = "["; // for (int i = 0; i < embeedings[0].length; i++) { // res += embeedings[0][i]; // if (i < embeedings[0].length - 1) res += ", "; // } // res += "]";

    float distance = Float.MAX_VALUE;
    String id = "0";
    String label = "?";

    if (registered.size() > 0) {
        //LOGGER.i("dataset SIZE: " + registered.size());
        final Pair<String, Float> nearest = findNearest(embeedings[0]);
        if (nearest != null) {

            final String name = nearest.first;
            label = name;
            distance = nearest.second;

            LOGGER.i("nearest: " + name + " - distance: " + distance);


        }
    }


    final int numDetectionsOutput = 1;
    final ArrayList<Recognition> recognitions = new ArrayList<>(numDetectionsOutput);
    Recognition rec = new Recognition(
            id,
            label,
            distance,
            new RectF());

    recognitions.add(rec);

    if (storeExtra) {
        rec.setExtra(embeedings);
    }

    Trace.endSection();
    return recognitions;
}

private Pair<String, Float> findNearest(float[] emb) {

    Gson gson = new Gson();

    Pair<String, Float> ret = null;

    for (Map.Entry<String, Recognition> entry : registered.entrySet()) {
        String name = entry.getKey();

        float distance = 0;
        try {

            // original code
            //final float[] knownEmb = ((float[][]) entry.getValue().getExtra())[0];

            // -------------------- MODIFY --------------------------------------------------------------/
            float[][] knownEmb2d = gson.fromJson(entry.getValue().getExtra().toString(), float[][].class);
            final float[] knownEmb = knownEmb2d[0];

            for (int i = 0; i < emb.length; i++) {
                float diff = emb[i] - knownEmb[i];
                distance += diff * diff;
            }
        } catch (Exception e) {
            //Toast.makeText(context, e.getMessage(), Toast.LENGTH_LONG ).show();
            Log.e("findNearest", e.getMessage());
        }
        distance = (float) Math.sqrt(distance);
        if (ret == null || distance < ret.second) {
            ret = new Pair<>(name, distance);
        }
    }

    return ret;
}

@SuppressLint("LongLogTag")
@Override
public void enableStatLogging(final boolean logStats) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "enableStatLogging Called");
}

@SuppressLint("LongLogTag")
@Override
public String getStatString() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "getStatString Called");
    return "";
}

@SuppressLint("LongLogTag")
@Override
public void close() {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "close Called");
}

@SuppressLint("LongLogTag")
public void setNumThreads(int num_threads) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setNumThreads Called");
    if (tfLite != null) tfLite.setNumThreads(num_threads);
}

@SuppressLint("LongLogTag")
@Override
public void setUseNNAPI(boolean isChecked) {
    Log.d("Class TFLiteObjectDetectionAPIModel :", "setUseNNAPI Called");
    if (tfLite != null) tfLite.setUseNNAPI(isChecked);
}

}

Hello Please how did you structure your Main Activity. Because i keep getting Attempt to invoke Virtual method. Please help . Its really Urgent

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

No branches or pull requests

10 participants