Kittipat Apicharttrisorn, Jiasi Chen, Vyas Sekar, Anthony Rowe, Srikanth V. Krishnamurthy
ACM Conference on Embedded Networked Sensor Systems (SenSys '22), November 6--9, 2022, Boston, MA, USA
https://drive.google.com/drive/folders/1WyiWeoP-Bx1RKkmZyc-zU-a4Eubke27O?usp=sharing
Instructions
Main Subfolders
./VINS_Mobile_AndroidPort/
Main Android project
./android_files/
Push its subfolders to the phone
$ adb push VINS /sdcard/
$ adb push tensorflow /sdcard/
./efficientdet/
EfficientDet with custom class to generate ground truth (see below)
Prerequisites
Tested devices: Samsung S21, Google Pixel 5, Google Pixel 4a 5G, Google Pixel 4
Tested SDK version: 29
Device Calibration
Use the RPNG camera calibration tool https://github.com/rpng/android-camera-calibration (set resolution to 640 x 480)
Put calibrated intrinsic parameters in global_param.cpp
FOCUS_LENGTH_X = ...;
FOCUS_LENGTH_Y = ...;
Required Library files
Tested OpenCV version: 3.4.0 with contrib modules -> download it and add its path in the CMakeLists files below.
VINS_Mobile_AndroidPort/app/libs/VINS-Mobile-master/CMakeLists.txt
VINS_Mobile_AndroidPort/app/src/main/cpp/VINS_Android/CMakeLists.txt
Tested Boost version: 1.64.0, with bzip2 1.0.6 and zlib 1.2.11 -> download them and add their paths in the CMakeLists file below.
VINS_Mobile_AndroidPort/app/CMakeLists.txt
Tested TensorFlow version: 2.5.0 -> download it and add its path in the CMakeLists file below.
VINS_Mobile_AndroidPort/app/libs/VINS-Mobile-master/jni/CMakeLists.txt
Tested NDK version 14b (android-ndk-r14b)
Choose the algorithm to run and set important parameters (./VINS_Mobile_AndroidPort/)
In MainActivity.java, set the algorithm you want to test by changing a single parameter.
// Set systems to be evaluated here
// 1: COLLAR (an alias for FreeAR)
// 2: Vanilla
// 3: MARVEL
// 4: (not reported)
// 5: MARLIN (make sure to set MARLIN_POLICY = true;)
// 6: MARVEL (for testing coordinate systems synchronization (Section 6.3))
private static final int CAR_SYSTEMS_INDEX = 1;
Set the expected number of AR participants in
final private int expectedNumberPlayers = 4;
In the addKnownPeers() function, set each Android device's unique device ID (uid) to prevent WiFi P2P connection attempts with unrelated P2P devices. If you don't know a device's uid, read it with the call below (the app will crash on the first run, but after you update this uid in addKnownPeers(), it will succeed):
android_id = Secure.getString(this.getContentResolver(), Secure.ANDROID_ID);
Scripts to process saved logs (./efficientdet)
Prerequisites
Install ffmpeg
Create venv and install required packages for EfficientDet (see install_venv_packages.txt)
model_inspect_generate_gt.py
Custom class that processes images/video to generate ground truth (bounding box and class) using EfficientDet models
The EfficientDet-D7x model file is included in ./efficientdet/
get_power_xxx_yyy.py
Automates saved-log processing to output the average power consumption,
where xxx is the role (primary or secondary) and yyy is the system (COLLAR, MARVEL, MARLIN, VANILLA)
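The exact log format is repository-specific, but the averaging step these scripts perform can be sketched as follows (the comma-separated "timestamp,power" line layout is an assumption for illustration, not the actual format):

```python
# Sketch: average a power column from a saved log file.
# ASSUMED log layout: one "timestamp,power" pair per line; headers and
# malformed lines are skipped. The real get_power_xxx_yyy.py may differ.
def average_power(log_path):
    readings = []
    with open(log_path) as f:
        for line in f:
            parts = line.strip().split(",")
            if len(parts) != 2:
                continue
            try:
                readings.append(float(parts[1]))  # second field assumed to be power
            except ValueError:
                continue  # skip header or malformed line
    return sum(readings) / len(readings) if readings else 0.0
```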
get_accuracy_xxx_yyy.py
Automates saved-log processing to output the average IOU accuracy
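For reference, IOU (intersection over union) between a detected box and a ground-truth box is the area of their intersection divided by the area of their union. A minimal sketch, with boxes as (x1, y1, x2, y2) tuples (the accuracy scripts' internal box representation may differ):

```python
def iou(box_a, box_b):
    # Boxes as (x1, y1, x2, y2) with x1 < x2 and y1 < y2.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # 0 if boxes don't overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```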
Sample command line to generate ground truth and obtain power and accuracy for COLLAR
Set the efficientdet path as EFFDET_FOLDER
$ export EFFDET_FOLDER=path_to_efficientdet_folder
Set the experiment_log path (saved from the phone) as LOG_FOLDER
$ export LOG_FOLDER=path_to_log_folder
Set the experiment_log/frames path as FRAMES_FOLDER
$ export FRAMES_FOLDER=path_to_frames_folder
cd $EFFDET_FOLDER && python3 rename_frames.py $FRAMES_FOLDER \\ (1)
&& cd $FRAMES_FOLDER && ffmpeg -framerate 30 -i %06d.png -codec copy video.mkv \\ (2)
&& cd $EFFDET_FOLDER && python3 model_inspect_generate_gt.py --runmode=saved_model_video --model_name=efficientdet-d7x --saved_model_dir=saved_model --input_video=$FRAMES_FOLDER/video.mkv --output_video=$FRAMES_FOLDER/object_detected.mov \\ (3)
&& python3 get_power_secondary_collar.py $LOG_FOLDER && python3 get_accuracy_secondary_collar.py $LOG_FOLDER (4)
(1) Raw frames saved from the phone are named by Unix timestamp, so they are sorted here and renamed sequentially
(2) Convert the frames into a video (stream-copied, without re-encoding) to run EfficientDet's video input mode
(3) Feed the video into EfficientDet using a script modified to generate ground truth and write it to a local file
(4) Run get_power_xxx_yyy.py and then get_accuracy_xxx_yyy.py to print the power consumption and IOU accuracy. Note that the power script must be executed before the accuracy script.
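The renaming in step (1) can be sketched as below: sort the timestamp-named frames numerically and rename them to zero-padded sequential names matching ffmpeg's %06d.png input pattern. This is an illustrative sketch; the actual rename_frames.py in the repository may differ.

```python
import os

def rename_frames(frames_dir):
    # ASSUMPTION: frames are named "<unix_timestamp>.png". Sort them
    # numerically by timestamp, then rename to 000000.png, 000001.png, ...
    # so ffmpeg's "-i %06d.png" pattern picks them up in temporal order.
    frames = sorted(
        (f for f in os.listdir(frames_dir) if f.endswith(".png")),
        key=lambda f: int(os.path.splitext(f)[0]),
    )
    for i, name in enumerate(frames):
        os.rename(os.path.join(frames_dir, name),
                  os.path.join(frames_dir, "%06d.png" % i))
```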